[Binary artifact: tar archive of Zuul CI job output. Contains `var/home/core/zuul-output/logs/kubelet.log.gz` (gzip-compressed kubelet log, owner `core`); the compressed payload is not recoverable as text.]
Mmjoo(5#B'F<훢%R蘉Bp![-%)-zLGSi2h3-1'fr2wC'r;tYtꤓY\9xI>)'?i~US[\3YTLvة(pIl'&_&荘aFM2Ҳ0+JqN IE @J \'V*F8ҧ1ffz2/ @/ eЁG`S_&DHɧ@'S/9ʵQj!-[qdJ 0r$+G0TG f{X'RȰ5#rbv'."EIiY2X9Zylt {B DGB`DCDE|8%VQwJYg<A53 C.i X1# pcmJ!W=*|I1tA1m#uj{bӞ9qT1eyGqRbyM$fУ]´*`$>aW/ݞyB 4 "'ZD/xK%<=#Th c)fe$iz:, 3lDq( v)3ӡ˚Bz!erz,gO\]W~o(}jix]*ݫw8`#܂~}3țOC'FXqhRćq=KK:4KJ$#O&_"@x~aҔ#H'ACM?QD5Ncp0u#WQ_a9a1r:ַmtKqs>USNoۛKj4'fUr.TS5{^X]'ϭ;vpoE5"wT'ԡ@hzvdO.;[u# al.jAwKTYr7?+xJIx~iWALW*j/l C/.M4 i;͠9o5|t6%/ bWW@!e]tDH-Bc _ EZYM"r:G̊Gxf](0)&^gD)f f3{ ڂj"K YzYf_GGjJA#dNVOj!I2++2z1ɀӡ^EF*eo5{3MkP@ 3UjfM%O-YPz]̻3՟YZ 06y0u7ģFQrr}`N6+˰R=@"ɷ_[eQ1swZw~&66^<0H!DԆ#h4~of9J3+rS{fxSW,|2 䫜 njdGV $8*jY>_"{,O+g}9r=g2A>E[xómbm@Z,ao}zp 6ަt6yF-ʶuJg ~dД &S+Ln6ݕ!w0|<]=$_vwTr_w"X?ny;fͨaZ/Ro;xOww󩷕w pm}aTU3~V\ԦWZn;gW+ۑȮ"Ա/`kc4/ƽbn$c'^Sn2d݌9qnκ3#"A\tDz?bXfy:F99|<+ye#&zg9Jƥ5>CB^):Ub[Hkb[l/m,oW*RWmk߈}"+sHRY J.[#1J(^irZ|N>z')1BT\jbkBm3HaY|PkQI ~in77鵺y:fX8ʘsu,a'CM P=ÀYk]NZ’.ѝFk1lC+k.O|5Y!6Vщidm5U*rFLµ,ܝ×2AG0c1AÛ116iM ӎrBBkO0pfyovܲ>҆oƾ;3dޡ ]JT!(D}nbHpD#%i-ƹEr1dTGJJ8>;F]!{d~Gg[/׏oYaoqHֹ d.C*e I" G\0BLE5(%LZ0۪ٜT I'9Euľ7:%!gFrYkiQ3%B1|KƊ͠4V¶?+FDmX/<$sk ì5H5>GmB Y5;-Z2DJ1ΨBU)FFɬ""cXb@2kt,:ZfPB Sՠ-JEuvFNE] ȀV !S.iE)1M PT@t[ 4sR #:l\ #Wx<%HJXBX ypk#9p38AEAG\ZhY[\Kt ƳHj˶&`!Yc(Jl@5iaho4To*@Pbܕ%ᨑa3ER).il8!oyVjEk2grBfMy O:4vh 2eY< ޕ*Aɸl )P/i:o41< 0 bG\.Xa "sL a' ,QqlkL67_9tRsu4YR@(qHseԸYcj|#TDRP6Z"AO]!]RJ[۟nKowmjoE&$W՟#XXF\2Y]a%6}R֦1Fj0Bfc/`7DŽ 0c3ƂDGhP1UpJԃ`(.  * N)n1փ-[:= ?~`+A IՔ4a9w(.r*ap c:-bp #+yNF2W!g5(d )cthaqǬjF_ ZFZ߁XV=\V9iSI\@lC*!?!]BQϏlu~5ԤM_}0ƛ Db1ڃi#bU1/`"*:쬀Zx0E#=J2}_HM #Y zQi0jV@=mBPT `&A}6oA$\0 獍>%Y*lJsriD|9qaTǢ"ҬR-ƇH re. 
8aB$Ti۟:kg Xx蝓"2B!-cW?I;ݟ-@`u^*i#WDl|k^.SL'crW^~\ߵys_;r@0]@2@ק>3kblYGMm\fXU^^wͯ:xпqjchkh ~X@7MQ0gBv?TO1:-s_NOasPiu*&ɝkJ-U)Eh^Q6e;쫴To|/9݅wA8+/utlt[Jh$h^9'uT+qJD5;n!M-zW/.hH.ײT]W8v3V$%aW(oNMRYЛRw @6rN>q*Sp:3ί)>¾Npe`_,oϏhUwv~|(B;D{zu Vo*RX5Ub `&-hs?"xK_*V5DָIW4gL05pRW$Elk b3Φ3QOQ?#Ӷ-&lESVΙX hYإw DlAhxnzszR!9{o='؏Y$NiNa(cKRi]$Npm#HK9-\x؊O\c1ylt|罰2GVGVۏt~DwOn=}({{mNw۵m=!./?~ؤ2νs;:νs;:νs;:νs;:νs;:νs;:νs;:νs;:νs;:νs;:νs;:νs۹l:::‹t;v+{ν^Eգ/|*F?]z/չʔ9YJcv7#H{vWg{v Yso6QmtCOL;iğϵ{4ta褿D4x5kmH _vH~0Y'{6q0)H,ˇ߯zfHm$itU].6UN3$BkN,;W" `$jmK",#Yë'I]7\Cͯ#)PW4JŢ,5QN3L*6.U b"r9y FϘ#M9Xct|s9weDZMͼywv#d6z}KJ]evG׳UxGӺ*pqV{_!6.0^]m:AMN@ v@ ZϟZWMp';:ٞ?Yw-+hYun ﭞww:+{r|F[]^~y@ۻ'zpUob>Ȯ9{Y"טj,<%ŗOb|8bN, &V˫Orwھ/5s:o\7}v/IZDݱtp&(,tQX: KGa(,tQX: KGa(,tQX: KGa(,tQX: KGa(,tQX: KGa(,tQX:ۡ0vKa 5]}_ oo_v!Iw߫]@90B|HEalHE@Z.TT|"N*^NYgL*3U'+6 D#^rÜQVK*5!t!ِhwv1F"xgӈI$ipd0CJ[3e#ӡ*x>XV6Q4Mc F#m}j3r-GΧϟ>v_͖ޙ֙ڛ} Gϧf9-}޲R| >gA[w%`R(*`$ *|koHք#Y)O(( 02FO`hH#ӁfIVc3=#B&Ξ|kck\:7|5pM)ܧVۏ]?lc 4BKWjz޴Y^zJwe.n P y YpsJG Z {%lV ΑQ[7⬯@Z`Z\{Uؾ}m)nx,)‚pYkdؘg#"o+k2JF0ڷߚyOя\l;&w5V^'"miܠΒA\ù0} jw7&LA7yExmY ? ?jq6`|_phM'uiUϦϠo;́ɬvCvI[svެ>3".$*̒-+zS(&C&-EAs 0wGـYnsojA%J Hm-YƎګTA=I3M:iy=jκ $DtR}W;r54N/< g&7Io@/?s}]T=Ku<2CWH!XGc+SɬRI7 ρ/0Efo{> &Gb7]\krcdMGo:~SH1CKB1Ŝgy5ci*g!ʞ::`+椃߂1D4ŦDٗl풭}8FKg9b/̰֔&N^XR ExAl6qFW9YݠXkn\Clۃ}jw%c @*A )x@xhPvZtV`ЪHJEa*F$W}{ݠ xqC{!9}<UQֹglǣznuuB޸HZ4.L"E%Mq_,S+=CЎN RL Yvv6qe"f]6I]K+dt5vo< ͘t58}فXkY\;r6#kDHP;p\%ǎ&8 JB0q"/U;t<{}n+;auމw ՝Y(oA }j#F1f6Г=鶇cm!tqtfT+{hRW]vO #5tJՖN̉?r/a:叟߾o~L?~_??}z߿}=: I>X&|n5ovk>9}+r>r}omZ(}CCR,B>IN9 8 }Ͼb_4kǺT عT&D~1eec$F10Jo4 1u0,  @G!=Rr0oc39y4;H_1~NFVilH :mSlD9)1 &x\DjQB1|OW,w.c sWς"8e l!! 
+v8-͢-zJOSo@ud4/7y:o.]b n|[7h.6U7 jYX8A9bkpl,)!ؔl]nq}PZ0{]>/Icm1Ux+ V] zLWgL`B<UX-zg՘ `e4zl5ͭlejˏ 9A f2q1Gyu:ŷ2 &LHNZ&It^(,a = IML"jxT"7 qfAo$*ŰIZOfgqxcXB8f<˛c96OK&Ll82 ,Z~}yaNNk>ޖ xc;ijljwx޷?oYD8+P^VYJT0g`Eqc1JJA("wV92 %Z@R.&`zK0Gm74OФD0R-c6qv[( 08[-|zs&U0ػm.{'77LiP`t=?=C[l$VZ!8Lq1"L :"p+5#0x\ uA>M !D%$ItY$R:())*&nmGC\ nq_M2[mRvA[g4JD$aSOcZIErl^'4b.RIcg ipCdxю &Ąǐ>r8Afg=ij2Gcs͏ZD"bEB0BB2p[g4AdN -9pr Hص8l{[Eӊ=k {Q~)WX֎c:lxHv@2*Pߧ|D-h2z/~n4 OI6~ /꟯^4éɫ'oUGA+B]_Gq~KgW0q|&cφYpäzdQ,ݥgk\Y:3{u͋M0ҋPҁҪ&9&mжjRwfEly>,͂'ZTۆ-eۯ%t[mhg7^c;ӫpߗp_5W døm Ifcft52Rꥆ|ߜ aE0LdO5*Cz ojRz\TknLg6V;|]+M;`yXPoW_6O}.k>ԛ'ak;]zͻsa8v&7=sx.L'sB౜Zr!#hi]I ں@&@*˳0>{MԩIQE)3ZTX|M#ɂq"Ϳ|aU*((k+k`@b[ady:7aQ{ԒJ1Buu?I1Y_RIk+˭"ГXTA]_==Hk^*8ʝ7<լFp jg>NLj9+o}ZyZ(ê!s$iuW+׏޵ö X ;#-{~!~8f 8 a pU+\)\:P+)Wp\!9:T%K~VON#[xH2*SX)GVFo 7sL(QmGs+:2,"1r VJH?#Y/`3R/൳I``v aT8mS58"EEօ3]SUUuuV7s9#g}.hbAC#1ƶnuРgTx:o8`^`1|>Zy 45"w`FD"J.R+>rT}ORW@0sTRձ+V]@u4H=!u{2 e'Zq TUW/P]i$?c@1dk:u*~ cV]ue}oN]q7\IOE]!ZB*Mz2W??0CgqSkEտ輝Z#얳&:D^H~Ax:p0a6{r&ȂKgeYέm*`ɨi$t|vH=iҶ>oGM _<#Hvx6FO"WqXu4j uiT#e$d}.O_=7n0LWq@ͫIi׃x*` '~8Ұ d0p(MPr$wKIp2jDaRkRD&Npolg<22壶4GKh81r6; WuDJ FPX?>[u}諬o7D?8 k儝ݐKEIy8se}">K=[LME]9XO] $y\ԁ\Al;n8J;:p1ŎӗjSܸJLf fI{68]ߜTH )=ʳ=ݴUòV80y kV~Rc7Yv<V\]xs;|v-c@z&71k{˫r{?s"`o*W+ןMl/_Þ(= nX}7v3(Ac*3+߻Ntyg;zlpBNYkXXK<蜦K`-Aw=g3^aFyQ<Ǹn|7W|7?|o7Q w&_w A?m%iko57byL&&\CQʠٺZ :~}ӏeʨwzp<~bUEЅ;q b~= &*L5vFU 1u~qkt@BGm]-c6R %vFQNLPm=ucegd2j#X1\a:뇮-6,Rrv: ZSLP."JL65;2E%򼀥9z&KEͰ8Y^^EaК֓ˁF rVQ fvM2noyc򨬟䡞") )E/(18-"Q8Z)ϑJgɅv]rIT>?N]6<.S+tJD DZhgLGC rf0ٍ$ a U ن$[L 7qɍTyVY/VCXBU7D X6㓭6sFû*^&c __U<91ڈMNi r5FxZy4L(psf"+:t^, B!b&f52:ǘ'XX4Du:O$W]NQK ʔUqO|+a ]B0bx ;g1k&3Ye)ɴ͟`or0NC/ϫzP~#JgXBa"F8.` -kCY|JS"@.\blSIrI"uPkQE9Q?0ljR$YJ6 J @5Ƅ̼i0yn:Bc0 ],ho$nyՄV =ק9vĹ[CBAL$1{&ƮwC977Ί=e^?^` hz7`OtF֮JFnXy|Q`tY+7V@Iw Wws 8,\d lAт;ofR"ZL:xY\>\V f+%˺:?`BBqje//.L5,IJK:6 ~=p#lhmBǡLOnz\(AF;&-o5* K,_q8X>XŒ{ ĔW{&a|oq?Bd9{}p[\=n[fNn3I|ij54k={t;k  E(lbټqy칻G:.Ǔ&;Bhk[5u5oj(* oC|/@.NAi;;.:~AB&(16ԟh*n㩚[L4WTk$":)/}V8ŢXJHBRkfuPFk!h Q92C>o 
:4Pb):)pJs*hdgc䬇C&LlҰtsMqvCkB^K-=y0KM'˻tk|]h}ZՠΆIK= P'Z>$*NEb.T<+*㠂wxVUO݃w FlZ9>p5kl =ysOQr A$w\C/m="D$5%[^PVxesC+dх8S@{+m"Ag `3z܋G=XWRCn+t?bZ6VgtC Ҵ,4%#%d'xlF۠FYIۓ A(tCoQZYH㭛vKŨOSi: ENI3r;DFDCg !dNYA$tUhwo=)odJj׆걒m[N|bmfnؽڢm<<٠v׻o{tH Euttv1Ϻ5>}5fGؠQtf4 )U1\O?`~̭`U6Iegs #%kB9baVYX)2,w)Š }+|%sNzcTTFٳmP8M.ہT]?\7*.w]Ѷ;SkY|q.=vtpgG5"u'*5݅Ƞ? dwtgW:_#Ek{ۖ+_\xGww)u-tH[&m|>Lwj!sx͛`o=HExBt>>*`k2()aPee5(015c@9ws`d 7]I="m#SD2skndZ `ld>u6u[poϠ^\sFe 2BᗀB\{NŬI) U%Ql$ǹqsȶi6!-h'jGU#g߈9.ּhhyW٦~޲|$˒JV??UŤnOݠoB:% m\F! |cdI{7!{PMHM!DVl5 "ѬV%ˁ$n˞.ϭE qBwkt ѻW:2ՌW8-XF9 n79gg `xy\;p%_temW0Ə=} ` ǫW5p"$4AZ5F julhөP{p}< ,Wbtv2WgNk(5&7,Hڴ(mڃx{=fi4O2ژ% wKY6(RTғ.es )q!X$Zh ֡ R'h'3v/ZY!5vg^m_՞1`0`873ObG+02DVFI(nc`ya@Xvu'HB7AËiYS 앁)MD& -3*v Q#;YH"'AHF" ke(u)O=^XWuWR1] ZMI2/Q項i1=? O/GsCmM{ R*8wD~ M"ƽ'!Z@XXjb R]AMjpӨz^z}vܕ{ׅk Նa-_xEl0-&y ߕ@L=L>Q4V2Xfl4A ol_47y&J Bri 4Fv40lQ =XH$$c"/pEIxEi)g8U(E{~9aC&>lh:Дgݳ"saK֠v'=tT:u95kT IhCFCkIV ½,X8 ΄C腷7a4jØoiؕ)E7޾y$)~|";`ms1H;FjޠТQ67@D(v}_ѾzgahxHQR퀋1^WdkᎾsV~;OzMng|MZڰ_fwQ]RrKIw+x] N]nDS=yQ%'_1lP %G!Vb9&Hn@H-U܄B-"gI̗4YvAj|r *{՛sTZy4 el4! 
M^޴=іd)bjFc!ǘC`H1rA'a1 O<$<#*II֌٭Gх8Pt!u•񛳋91pz9yۛ;;=6 u]LdrP qLxukR[ePq(Y2GdB64*A=f]&RJRƔ=95v5Pv5v`r!4 FC,Ȕ$ Rf3N[,]&]`@PWՇY4Wd!2䊬h HהBS:^!J!fk7Fn}iQkm5e{㑩̼(vђiKFQ:IG/ xﴫQr (V2ZL2D"%-H\X9{ެ pV/zQz׋nJہjO__dZSt6\މ`)$o%2}1Iw"W\~n77,RF4#h)3Le.q+gB2R=RǾ<>hS胲+JCJc7I՜v;*nE["Enkz>vKt>?[h7XA;h)t_t]Ϯb>;7.%NCn928]ZvA4k>E'FˬRFygcnJɇRg$ߺr됣-W'Ho&R[?SHjM/'ev.뚑ٽ_6ڕ¯/70|$PX!-x(kS>J6k;s\,']Yoc+A=`Z"OݍsӠ1К *Im8$Jl'TUlYtHDkI{(lҋHы v P}:xgHB(+ISʁ4#&UtsaUVz iߎkVu_ZxĨŭϼyZYw}kۀY!f{_z_|ږ+v,P/&3^V\&^vqK@)Φ8䠕=M{nIPT.SRK@M|!M2@Ry'cA1}o+)КB>,+Qz_Q RXVKbXk2PٜNZo TK&%v'.䯐yj}G~| sGeXoļw)/h2%bQFi.kn\/ hugo۠B !Y6؈m TaxBjIQtMIRr]bbp*ՌL!`>W)+Ys)D(x8[#\{+9&3Gd A9YJ&rl2EWC&HF#I[ˋV"٢~\!`)c)a+S7*&Y 6H'Cs*%C{Pp?P:9&ȺhjMxu:%KP=V%+lS)j_ٹ߻jxeqKڰIBhWQ"d,:X2b(B"Yllll\fہ-(@[\Yo ہ;` DL(.gæϖ!42 o Erh8~wH7@;Cq,0i}0%⚸D\}/פ~(Kj];:KYOo'8^ sQĨ@PIQ N:l{m'ʷL͜nr2گ!󓓶^ ˧fv!6ij`i|)bG@oQOI!>}7oB}N yI^\.ڭѻ|:!+X]祩t~%%,nKQ6)gʬr+ Z ĦPlT%' Cݴ~0ŸY\] i&3kݭL[j]15}]1W v?v9߃e>`a;üY®wlXQQ(mb˷*ciK&ͧqܰ#qͦwoy\./O^yꪌn=#בfq3:yaL̄ 0_>H4_NZo&FPYMyiN&b6θMx=\LM܌-gǚ/Z,/vh &Dt2-zzr~&~@905N^x;mt}vߟz/lD &h :f'(1A9rzzώ.1!U3ot{xݸUB>TbUǾ\L Y;aGشj ݉͠ Bj&Nc,9VV lCNXH} >PA>'eyu(#1ɚr2b"j 6)Ϩ"bQJk2LiMYJ&E5W4a,.z)Y˒xŸ0b0V+x7q6C@RחZV2<-W}F]=*qSn^>Q[Wݪ t#f%bUbhEikv09,IU?p 8d q͖%m9Ņ,ZS*'U@::qD.-d KŊ\CX ,WRC+:cb2xκM0.Em9${B ICjlJ,)焮v(g4(U`=M>?>Cz ?c6Ɯ "j qtjoxA@BJ' q+rm<OS Y ~l "K5l*K 3+WBbaX q]7x=goF,ӄW!DUd3.GDEz֧ F%k#²]h"'b|TƧ:6ub5t")}%*Z RU!l9sr)yչ?= hL95Y{kUZlP^"j =@K-H+{1E՗;鍰xܾX˶m _43wtڢF)KVy『v_}լA\_Rcvkxj+z7TWU;ظ62*u>W+JrhDM:q|1bO /;ZYUG+r6ջ8ݮ{=4QygRVxwso磺 :xCK^%/neWǭ?WW faʧ bso/rqzqLd"K <ѧz0y#9"BV;$#!^q~ Վ+; NQ Χзk5iuTFidKHdYb&:#RKVQy\J]{6&ڧ+vRO{_N(wtB"8@@+>(`;͡p7i ;Iip~V9M1㴓t)IA9&K"(Mѱ* p:qQR٦Hɡ8bmqqCXM:gҲϩqW: .bV|RxSUM91 嘤3dkȵ8ЫM8Jrܖsr;+03!oÁ+(0Le3׻ RR ^Lx*C V{xݤnax٘: wt0 ]5iwwդ\v8㮀o6ח &`fb~Khr$g#dTx`G>(Hsr4l2|.ϐPJHs#_kT7~̍]^]M)~4 Ͽ_ )X׫Qc|;:+4şς;89= i`@Ąr|qV1?k @۾n49/Gϧ|ya6fnoe2Jes;#w=N\v=NZѾܷ@ S|@v`U͡&{wդ\ w~FX``Uա&}wWMJ zӓss{V)f/Ͽr: [Qd4)8LUqԪXM,pJmbFvay7}5^2bs) o9S`{6/'afە] yflz g 
3c>n7|̾38ˇ?e=ХVUEX;)ۿvb3IW&EpM4gPRXf,s|{=GN/?]>+vMe(OKLZ]zbUX5)UUnv>njC>HrRN[UjmFKe -obҭp*dfYxPGZS1yRU^f]8[V}RxccQ01D4drdeV 8m((!4e*n"/dH_HgWvC]\1MNS,f-:*NagJ{H_!SRcq;`a) H5Iٖod(ɔD%m5XNjx:PYù!@OЛ[pjBSYL~gKƺYtƈyt)(e,DUfȼtB&PA a![rteɽ M[b;I&C?&5N^dc$ C98 N!ΐ.GPjk#&?;gŬ }':~x/\tE2Q̊7}b796v?A}Ŀ(0??ܞviNVoK<]FBY6u(qK A1.8:gG'FKe<1xBk5C]9~e&M(-.p~54/vrcY^#w[g2%ٗQՖGҤ?>[C_r(a\<XB_A餧y[zٽeo;]7W^_g7qGd=nZK s濜rt\kEf_.#%0v8V w:[u6ҭuEyKFp0l}`ɇ|W<=;{80+*r]W= [zq٢s(7V>1+[< {xoE%+.I'5`zO?:}T~W?}υ}o߿}Οh&)0"s?ܺe[+Vtmn-k>l+{|lmW[]/qǗsv;sE}1oKdg*qvVߔ (.gu!n `ռp[<-|jN}m$ȓH1o%J|{t)lJ5Ü{^8V9}e~؟3Az@mb)2VH0/+'G~,p$ߜα`4LKCFc8 &e*(KmABgJ2S.{ّnZSz3$С2ͫo\ELq R: :M`KǢȓti.hf,Bqrq~(tbWgVyvL,E=y]z݁Bkol VI{oK/OO肉YD˴mAL v]0;NTɥf.3bY+m,zeFEHg$nqL1SV81 ()g \(E}MRPM0|h  <\dOO66/ ^e['=tȱU9GMF"{MQj6 {%Kg!0kCSGCP!t[xgϞ:[a~s:b]~HF{E߰?gf==]b=oMΨ'}In9AFbmJ|풛YX켝f04g+u\ *ϹbRT R/I1M#rLgXVw ' "*[I+^~KE!ݘҰZ:i w:XM&'tP, \PN` fe}f&˶2eE9o=q&s@Zyٱ W*p%`KB8)-,01`#59+ -c ׃ ]c97HeTTmXM͞q=Jy_XM3ޫ/\HTmrQfW̒N Ǘ8WӇ'z`p,ӠH]Bm`)[a4`xo J<`[_< 4 %I0MM4$#\2=f]&QJPY46yz-q6{0kxjڱv`j4&d.F"J"̖\Acjt|62c?JOGq !ʢDClץDc>T?&f&u#ҏ]=e;N#̼TQrg!POBK,%:%S+FG0{\]b(7Fe2 r3!h5jN&icLZp6Iq-q(~q33C묦%E_\vK}CմcW*C?hpr<6Fn+!~Z"іAnQוC>N9,^oJqBF~tT}|ljր&]ǣexj媺C}qdߍSE3d J_0:TRĽbYr\st]t,)R*EIJRd\3d `duژE3L[fn<388@poMz6 #ALU׫2\ulf<^y%N& 8LÌshLR~: p*et{2lD={i4H E b*X4F&bUb$pio4.1qLqUd9+jlQ9:kv>eJ`yx(MJ>% F%IѨ L&`OEłfJCAgA]˵z?!Hg)c)as9A*0\֠ZYlE]bPzULGd^"'eLa3}&fPX7>8U3CJ]*nc2k,+hdgL\HQ2FlUBlb\lmgcdc#v#w=U;;֢ݓAt#M C~ۋLCqJg^Ô^uoeo?VܯV'{I)l]iV]z#ڀ$be?$}ж\J~7ұ,tvds^t[бQn}]=׋ޯӏol zӋ-`G{b4O A?=.S2:f]zH{NUjky Ysoө?PD8E7)g _(A@f2: k.6J͌h@8 +ٹ/:5moU[RS=^=:ItBrdsȸY4 K$SNNn`7wZU¼T]̥xtm(跋~jG^Xi}ބ+p3ymQmwo%^kq09E`:*kZ+,Ǜfvm;8PNI:JB@Fywl$7uʧ`#"K w/[C&yS;]쟒h9Bd9H9&rW.iE~ %cDҝb ǘ/K29hnaHǹ xm$k-&$C`VV)rqkŮ{P)ڞІKP=*gL:)\ , di2L{-XV܆Pͨs>; 8 "e1IРr9y!Jk 0*ejg=B 'EFnP&H',1,l:dvVM-ZEST:|([r-˘.p& 1\8",dJ2iy 4O~ϢRRBF@jRpHCr;z2#&WbZq.Oz0JYRw$߯DoQ%lAP Sq"(X"P 6b2R881N.w-ةLw\}oFۄiem .z#[@Hk@wlJ3"$O 胡"l|s:m(H v6` 1,8N: @6Aϟ;g 0vP4dWC|X͠SfNwhD)G$t&Tl"syouG7?h KG(u2 
@noV!y\O;S̃-dYt\bi%2:'e\:xHRB2Jg0n@>{oۧYCz& W8e5sBmkKӕC;zKrjw}ءGBG1BΊjpVyΊtξg 53z27%}v0] VHyp߶/?ܒN.X~% G/]Jȵ#k͹/yK~#\RStvCg_ng< `ᢋs6W<tfCg63m֕N|Yŕ^FgI{8+ }>،Hfl5jkyG> IQTK$`dWWzjmŜwywۭ[Xūt ݃!6-;sҬ_5ur}Uo1l_?O-C/=6xP"Zh퍦E&^8el_HRudSk,IQp2]J{w>jF ܫLW ©"3#(eGj3WS͑wүJcL5?rO@/J촸=E+vnf7/! }K $t1^[$sw-[@ͥ˾wǾcrzŝiYS#MX'UbҡY]r 5iwy 4}|yZ Vς:pθ@Pf )3#0XqE%JfBNfSL& cت1n_Nf˪>|)37^GsԞ#Jn}.Ad_&N6Τ+A`I &f!NSlSmv}8TȜ^:Lq K0RQ+2s2kLkȬ treqc*haK(F[ :aFhNPf i\.ٝӫQermv-qQq[ MGǸf>Ѻl׹ԍ^=J)~P&-\v2WDh>["W<[W0EdUSWڇbzJR:!qNF\EpXWZq*Bڋ񞢸R0vJ*NG\u%Gg)+ЦT. ZJx,KRN˛ɋZ.3M8ĻvM'yx8-zf|k.#]"ϱJ#̈́$5HNdttRVL3bZ1oLX\ \+<|7(8.q% d+]z)$p'#" Z]\E(mW"\WDdUh:z*BYXz: $9!qdUWSWZ]\E(u+bbOH\Ek+SWZ]\E(Y+R<pF`_}_Q(=;K= AY7l6 ?r#ʢۧaYF µQ#˜a=jF kiq =Ær/'8ǩK,&ig;@Ԕ.8>.mu" ̓Z&Q(@ɇhnd=[N,j L/ {1문L.2͸̲.i0&Irko"+~u|1K`<iR;ɳg{X|M\ۺߟJ#R\}2"UnUQ1 +r2*K婈-?zO%Byl|z$q^ anxu53@ͦA^b09 bbKH3oSA32kT aiΠFݛA,[Z؋nsL?2=׽ @`ʼxqiMH&ZKjQJ' 9h?ݹd''({<Iz*s> "1vG.%Nyaq/KM#DגK`c nr<>0G\;_" ɳeKzЏOQDBI%DYMn>V1ôVn"L@&~0E>& zQM/CM0K!VgNm*'dElGp%9S,BPjݚbOR"bZS ` X|2쇴MqC-WQó78 H[f2*ks;-cc\Y`@[`̪:Q9TgP ~+j;d~eF;_㧵VR=Z(w߶l Z?8C`Rg(ggi,m2c20 $Xpg.m]130zn]u]{6E*5xߐZ #  & v*"n+!=Ń];|t51O2E`EB!w7^83DQN ~[ e`'Z|JpzN!n΋2^5=_ޘU7]SFe\5,)~UT*]y'>juE~E^wV+SG#Nȓ1?lDEh>vO$X7ɩ^#E19|rV.}'p >pFݠe0nP#WJjߥJ`ANH\E5?qNE\Ehы%學z⊀:rm5Rյ~yϺ%q.լ)Ta)̨[MR"Z_g`3p2uj)iaAR.k^]Uxx9CYaTFmQQZe #@YȈxKNrx=Hk4 v 6+(M>ehbS0lژA!|LV#mFS-sH2҈9G3IiX)'Bا`r=zBHs{dJr2[aEÍC#3uUox-7?_T1ϑ ʄz>5{ŀc?=UVQ%z怬h$e_dz0UoCbl7 J}:ow\hy W|Rre=VX0szh~I_䫼P ̏{x2*vrW>2 ~D gcQeA+v,vY@ qf3 1:-b&!3tC&wp) <5eq =Zb"g^#ϖDRBj9sNu@WB՟?ADgd1fl0lE-4gsq1?Y'Xofz9g \ 0_N\D8q4D<#͸^t6,Rfw廲mA!6 1ucf-~n]'9x?OVKVU'c4cA< !L9 )(ԝ$\y|VUÀ2-6`M`.\ ޗGGJ neh|(G:—ddraRb 0TWvWN4&δ?S okU}LM3BM)Udi>eh\>|u*?a6,xQ7V=8ڛse>YwqY^k" (/.`~T,]&Q >j5Mn+^\ul*s 1$²騃qgySRS47('7r_^ǟ^߯~x_c޿{Ƒ8`d~W`]CFvAO0M*$e[^")D6ő50,3==53Uw=ohuo`:KF'i <[KRxܚ|r_U(~ڡ]z?~0П{3x3y{&wo16a%H6Eou,Tq8m]^^scHc֚@=͎}(^>tSZlm1m~Ov qPR4yٝw1(Lr.pDG\iBg! 
wNSsU64DOoaү)S)]=,o4s9}s&A+,Mhu,$SqA0{.GaI5LuPe6&qz{s"n[])+ #e3($mNيI ]{=2;; UB=~)gZsVy `yTIYgC~ :?{= k>'2$N 2,ezm@+9,st;Xh:,φrBu$zPd}>C 'N{u 覈|M~f)DWL5@orRV?ƣէ?\J+rJQd_*ďH!6d=:Bꀐ;]9PdBŎ15{N~MoX!E)W$450=wNI[Z{魎,J$ ͱhc"+43Ԅ\ X)iMQpTF%b`2aI Ea ,IzȈ @/krE$Bsb.$G:a1rکoDfX?ՈFF4.D$bHIDjtbUq#%cgF4@owKYϸQEEQ$I3>7)s@4ڡ^TpiEa(:]P.^# Z8aN1k@DO(N0$*D_:qͨN/C/Ec,q *lQf|\4~mAZ .'kFQG=E?^*QKMQ$XQldrٳ?2Kһ/֠J[h [h\l7Z uDc8@.pղ?-=V:@u9Nk?ZJ4:z.}&* 54rge:%*ExB&oԤu^W\sasHHX|:J9Aw7e!>e3kuz5ZO?670}nYȴo*W P mcGs'5Y-Xk2:C-d2uK/*XeO*dږ ݪeV8@Eqj|\PnZ MLTRJ":qq/1r'T$mQ$mE$ʧdɳD]Ns̩2F\=M\@$hRTQNT5.%eK빱1$$AUI !=#iEXĜ@t݌bQC ]hoF@]֓}|&^ y>ļdozjj&rPc(24STUhBeJ$c"@صXs3"p& oTp㩍9& NOPBZI-dAKsƋ9 ۴~3]K%@Nt =dCtA/SDr4Z9 wu5}A:IiMUDH)w\ \#26R;jA81Xt%zn@0$Gb%h)ʨϭ.CNxfMMŜ(XgdOIrv/]x%4,F\)$!H$r R%6jɤM$$;k X. k/z"֢Šn9Bˁ(-\_qO=FfR %𢚎o syUPZ3+anOۈ-\|# -pR`m͑ PStD)TDY.,wAMBQRjp,pNkՇΦx.La >77]7H wJ@DK |ZI1*vf3^|7H*tdh2;T_}/zɯ3o j~wSkd|7ԛ܎m~^Bñ&quػC˺ f;8\m;^{w}u}sA.y2ȭC[ѩ܁nOi C~u]h-[e"Il||xuZn7WW[[_-{Y[wꎎKf4p^Y)]o MܜjJxk9i'bݕZ:a&n~$n.{..oURo\^M _פ(UɿX]pvW+ЫmdRq>mB h a`zlX|fݼ~L LhB< +6d0E1Dh)aMJBL_hJj_C?C&8A#AxDsV=|E~.APZXR*U5QGGM D-2PJ-1.6 ac0V\ (?-6m 4V#yy8;g>/K Y=`rQ>2~^#1\[cըIh~Bih7>#_;h3((QPјM-SE/l3cj%IjbspY5l8-Y{.ڦko־Cms)f^11#1K@K,gۛFs*DqpdUpn1:' 9QȉB*48RL^O`Z1Dc MX6$;HH-K&%N^PtWgd#OYKF*2ZkS{.yz ]n%ʕ2x6fLxHx5Ĥ<)V~b*&(e]ac 8VLuA?!(+X>ؠ]6ljƙ{xl eZlYGs 9H}*FRkJtm\X5>G g7/Tp}uy?ר>ک~rJ8 R)ٔ\nX8zZ1(\f)hbM+gc\r!Hn5 [G ('!KJB!hZA]`ڨLr:ٙPem4Ƒ r }GE kιL6[)Rl0C^B7!z&Wݳ (+Yb/'|l07Ji@5bbO6vJ6Z%ok3!c]jeMW-B/dc'dc7{U s]V?r480j9a"_NЎ+R+]9Q;,Q5jZNуl-d?_%AP|J~k/涚sBazS?ceVۄ6?7.>':~wHm~E/eSe}v^gn]_B[b>7٧շ>>wpx=An-oY,!lt3 /* 2AۺNH } ?g.ÍƼ=>ZN]ϼ`3}(C?{A;"QM+)ҔՈq}=;vYHD wnOycf;CojHK v~uY(kQ7`ZQ<4| gPoFU=L"O,:i/zk-$\b:aL:Mf<4W6;]+~qfm.SyOz;~aztT=v{;ԳlO7t;l'Rd1u2xq $pb6͐ѾF[Q7;^LZ]vX04UZ=Z߉\cR$b<)2S b bb[|OTuJ3)PRѲ.ZF)Eӂ]/ )FLИX7.\4S F ^֍ pEYK*6Վ?#E)FS܃TAPkG1<ڤÆ 5rܐ,_.Z݈SF* e{j/o, :fwY[I,fP|mYǖ\BL ѸrYZVERIKw}ݝ^AjKϩy95kߋ.站Z @]o9_7nl!yϯ|J=@ju!sy?*ﹸ&2WRKQp9z>cP6%'|]LV&R`M_w;]hNGLBZ٧JzQ 9'tlRtG\cN\)1Fnl723@[0\<ё<o9I>7TF}3hpC|+hhz!%;  
x/jsy̫S{Y\ +UX[jZ%]&.ee l)RjQpQhbv-GihspU,ғmg+Ќ`3ؚ&XvpPf%[$N뛋Azۋ߶ު{B.\Tr{lrHfﳟףQeGbƷbQSzl4ZSR(DCEcb6:ȩ26Zbc:Lz.Q5֣ g?ށwGk۔4tw{S,~ytv_G߅E.ѷ3xd)"SX*%Mu7?Gp|ǵ\]^v3ϛ]] zc-X]9D,C,%raf[ 9Lo~ ޼ؔN t}\^ІUwn;t`봲͉g}_l6}+bp[;2Z&LapHqB 9үsA|G6YnCѾvH95GSV`M;nWv(N8ƒڥ@,-h$'$1dJNV7H޵q$H~0Y'{.q]]-H[(߯z)jI=B"3隞_UTZh{SILNώSzV^|i'C \2"f]P B!X)qCjX'^T<?WA73k7 b`DRF s):/86–N4mf1h2(R_M/Я֫bn?j"^e7w8OߜJx/<]/\Gef،LCf|"ST5 dsW=[TevFH_x82I:A6+&hjBTZ܌I %oڱɬSpaŢ-;u)D_=_fngW!J+XD3IbA(ApcU6H%"'VT_޾f%^ @v_De2FQk 3A cṟ*΢7&pPQ$iUtWzU^@GNGxUjBϊpu f(Į=}>3si*{RG~n 4F)H|jQ`>ah/yS }I]- svd!NJZkm LvȒWqxIKc'x@΁E<&wMQHISLPx-&#se[M[8;8jKs6}*{]eu.4|<h `a ǫτW ):_% 6]mVfMh_N1J́b 鶢QC2ӯs1‡ǩ4}/v>VY03fVh;9RL*˜pe"UkbtB-R! e&,y BdtT `MBs,h9 /|g@)+r!4O'称fnbMۡr[ϷV+w .nQcN c &Hp ;d ]L %ˣCKNOKk u^S&ZfP ۶u<Y-I4R ܽW[*q]M_I8ouT!OȀLJG-3X 3-9ggazni#A‚C^Џ:,B>aaAsU&Y}=~6:85C*K]=n~t=r:OVPetjeͧM&㚚bܛc席A;ź ?8};B꽘ﵻ?erlqWlp:-Zy*Ek*8I qߕݸc8lc"%KͣB?K:f+P<଼Ew`hnN᪍fg4:%!ysRt֟myAd.-k9M&Y9CbSR% <z#sܴ.zHvy⭜4LUzO50k%1EࠉRp&,}B}-fĖo*~Jl$v>3!lddK#92`9(mPe-$N<[M=O[(J//>y~rD&yM A3AKJdA&Fc/Uw޾iP+/y_2xҞv$mo<}cn/Y2,F_&;sʷLPU.R.y':6Lߝg!O&"4!~3)>\p~ۇQ](ΓFY2LI.Z DYeHr.WYzg^tѫ):&MЁܾEZ"Cn٥WMv6:$/.:ZZ Z0vu0;~ut7?yt~44}t< 0xnI;@)6ѢZހqp19?]Z a j|xjRsr>ݟ&9¨xOjgL0 0R(#o%p5;WPJ+I`e̥,gRE 3 :̈́#HUr-b@4: \t# B+\\Cu46 SʭFΆ;kU.vV~($K t\h_uE%+G+ T"C2<(,^uv>k^>% 8$$VH-SBec^,ȹ[PQ)YxZ_+x+dۢ|G N/ȸq^]׷ >Ykjz;롷wj N笽vK[U+/ri\20HEv3-.-t"n" Df4r)ƕ|.H P6eE \*X$3H TmX5c=RMV]хׅdѲ 4v@FՓ]Єhy4|ZdO ZcFAi|B\d-xuPPIF-< ktJ'i(^HKARX0*fH)03ƮF;Niܙصc[-*kmkxCfiLX $=(D,JM-Jo t2c]Ufml " !WdEDK0 XxY)Āj!Tևȹ[F|vGۮ/oFnTt_p. 
~`mǴqkָ]k]Ԫ =s^;a|GeR{巷noVV$ ј']>zeG4m$/FYY#"ӽ!Y@(e뛬,&tI2[۴LT+e}gx;QJhRF&e-H8uoo0mV wV60|o+WU>g{C'W@ |N+-vGJm→"RM^݋`o3#w ?A|j%dߏJc= b5fN'#ԀrhT:+UM|s>2 WQ|uMghFe$doy`4j51CfRs'd:gU)ϸҗ`ߜzӰixp^.O\Dܕ~Lc+`|sXi[D7X{te2 Qt}!ͨL<㘱h<(x,8Di=hUDP-EH> 2TV)bz.LGf͸w,ᒙQRk.X:lϊQJ`@zd5\Tt5r& \_Hx@{)ޕ'90uz >zLS+m뇙2v]p?A2D䦡+5^00=5&ήEp&wg8{w0 /&x@%֓ 2 Bp %8<1qH W/g$ed+CB2sRl|  $eT`S%*I"sR`j^C1]j "V;3Jb!_ioU/VH+JMk}!hA[p }wk3M.JA=)Mh-r5 RTl'6yFM!6}' ;HŬ?G>+Λ?;eLi^R MCsSEuk|]Ƴ`VJE9{ Jh Ӈ"C< R>ΆiCE)ln{ % qlNS-~k6}-֔P Rq%2 ;6ѢLAa%9,0d!a NFxPyyƽb ()= Q*^)Ʌhja>>9bYM!)afBEdkT1{eYP,wI$Zt:#gbz ^0Ӕ<ճInCⶪ;no6Lji|FeWgPJ2 FG0ѫcu7{5ޕGuXCpDZ6mx*ӫzmp6}b,̔IXՃD[(Zֻ̕{f>oy.`]JТaeJ hwc=7I]TpW[Zŀm_;݂8RrrKSZ' >w _=#(J=%EF/LYS& hҨs_PzlռNIi޾fG6(ȆCwtBIGX'h: d@Is $7)>^u*b9i4xۘSg]g[ɒĺTIn?E}~CڐƒUBGC~BO3W:掊¤D"^7փB "(Z!lzل>( k'!%3fmQavBv:dU R@%23Mİ{щ8xA[{#ُ{xŸUcD$j/O"Whe`HD槬'rRVA&zi0"´]NgIEOE:c XIFV%C1%) xoA%eƺ١ s+ţ;c /IWPPKjXΎGwM)|i $걓偄 6qV(dIg!Gl=uQF{Ș0416l IX,W}6W$AgԣglP6ci Ũu)RG҃¨Tʎ]$(ܹƒB'+쑞Bz#n_pyŭ 9c{nwt%2JV祳;Kު8Б&v/G/hJ6[6ъܰRs)15DZC482%2z6قA(RNR ^bM*wD@lQAd$H2BJQC_E3|e %Ap3#3\7P8M*T0on(25] vo=ߣ% 3l;cjm.쎮Vxύ;~3"6O?Uk.ז?X*۞evn;L_y/ε]vq= Ow'iY/}X !Rdw2do}7܀T1q:*~ߟ8 'K TЧ1Yal0hM # 2%X$ɔň2PtE&mcsBp*A1쌜} z M/Hhe>w<5L>%~n܆|#t `GkVALW_ dxu)hU(YoQ;L"P(5>]g=in笙WA5>gQkVr)džFZ|f?Ƴk nQ~75ޫ]o"_,Z0J9䢅6^n-(:$g#!u@(ݖ.׽@$i3:&A`ث%IپeQd45R> f ox-zK6_u+=A Z9tر;vT)#CNB$© B˴+yC|.5zG.ï7vwAxTψKY"Xi[%{ Zyl"~UレyºmWX$Ɯ0ju QllJoq=OFss*FF"_>R1NmEֲgQ$sCQS[0ԭ =$BgVe)dPg4kѿ.GT{@ ]<ܢz]+֨ ϼ k"?FU jf:]OSa-yG^byZ;eMټ̑ 'Bxx)fl38A` (% ])oV3_ NI4U;^_Ɛ)`0m#] .U&U~ᄏ/>jc46ΦGGniJ=^OkB_8eB7Pngz0o^Ìk.gޭ;#51pTΆ'uf]WEf_/";ž{#ڑ6d7 6aV7Lqx*VK>]-zz3|8697Q4ꦹ*śN-l}~Ov}9eU5"~,`<갥S<Ά/N_~㻿c>?A_}W`FȆh F?> ~h+[gh ϸ)o8}3nڮ0NZZjwӍŜQQ0,筷\w|pEKOw2lcܪa `S\Ǻ5t[}Hʩ3x}얧tU(ՌKHQNiU˝%~m a؟IsKAm-*"ma_6'0=e<+S?|N])D 5iK k${ɋ^ -AvTwmTB81o2ḞVcTvnjʢ%͞`S]uO{2,d";Ȓp/U.NRfhG3yTܫ(LZ3N 4jpl73LuP BTR7y'pQhju[kr5jWVPk&N~La과X:pnշ@9ܰ`r{?Tʻ_]wzxqpa0Z`YQ˳43OFӊ|ޫ^ jHkoQ9N{{_]t:cTRd+[WkbSRD}a2_BQ48u)hVH5XWfe<+i=Fm 0&1AgKRs=C$J1KmԚ{;+S)βԅt7~8q2JDՆ88}`nըt%V|#z.x=SwÐHsb9J(R*sPLZ'9Ⴑ7J0mBs*6b 
gI(M/߰[{ %\#x8%2dqlơXK.tr }HRo?2֬>'.CP+tJD |Zh\Ø3aHCImTd̆ߒ_19hHƒh%7FP&7?PSpIt di̐~UྐྵOߪ/L-aB{-$E t/p N7#b&fQcP9tֳR49 Xhe 183csZ1.le Ҏ Qo{V'U4,Wn8?33JkP*4Ox$JK&@I &fNnxs=8H\頄W"jfc3@9L6M'n ǣXs1f ZǶfmֱv`l Ҭ L#AXyJx.AEj WC6(sٽTځT ʐ OS}(yDŽx1HnFks?Nr `-#w1]->2pju hN'V!?zћgե*gq,(+`}ILQ((-ҶE4 5筡i+Z*<T|49 j{S7~y5e(4jSP,b(m ei7x-Fz!p->ďN3 h< tih펔J(!Dlr9̻K :ѲZ(~w,ȇrBTvV=ܘyћ~\^¿ؖ0o4.|ϯy"sHTBb /-28ɗyN,jCB]-PUb[m3)|!Uu1zv55YUZ=kb$VKQc2Ja=ܻG(0_pxz$Ze<Pphz*l]`tk BZ:]J}`:zbRXkZDWLJ"dk JB:]!J#;:FRY++El=)e5riI#QrP65tU5m+Dk#+4Et3+d[ >]!J::BDfmOgU;eG9]]i5dC+Dɺ1ҕ1D0"B= Jg> Q ҕeFԄ@E> +Sn -iL[4Āpek"ZuQ7CUOO̪գs4<*VIU><tE;ڶRZDWXB-;xBvtutŌUD0׭++Z?!?^o'?'VM*lDz!4~jƦ[膃StIM'pe-|8.,Vao/?χ?N>+-,iu\Oyo>*Mt_Fr<ͦW /A~}Uq[YaD6w_pɵOuT:Jj1z1"|49&B' õ6*b"AlTSsUyJ0_YCڤ3P:^Kcu庼7F)wh>Z?Z0c0R~:O CG0x>)[>rsp{GGOzcgPGp#Wk7Y]UCشT*SO׿'S:0p~ ҏ.l Z7 <4UԻnJA]){I4NOJAR_}z`S25 JU WгB ˒.UmqOX3(T~&7GYG=dWNG}:ֆru*SY]'np8iwړҒ8dJkwV]٥S]ߔa^p5S6h-K֤eNj QxL4qo l:g*EKZyjFk}jsvNZ&h8X5w}~OqLkL<_?yluiz{zhbjDnk_\G $T{E?6IQU5nn6)%-.GQR';g@ϟGh^ԝ tQUh9K bJc>,VZM$*!=}`ªy}̐]W1N0JaPFg5ey}X]u.lk?hitE^iÕwGPwvmWC8x߳$5 JgBa"Fvh0 ]@Vp¯]|'tc 'Pۇ@wgf3۫k_(̾l!Y&dP,D&Kل\2kI)$$:I5訢(1 ås.E:'fAU(VF`Ƙϴ҄g1 g87??5%Qso=Waյ7XNu`=H zJ+fv^|=߆D_9l-(ӂPZdUfSb39L1GL+X)-AH̡(fTjZȵ/%lMdFm8Lh*gUuD<|M961{g]Ѹٜm_Ζxݖxqhn> br޹˫O^ {Jvrhgt<b6wz3~1+fu üu^̶tzfo~crbCo|aJrC˧y5/l{3{[:Qe WF\ni.lJ~.] pdqr_(Hm"n}-[0jW;u~ r/x{k+s7%?f4^AA[%zV':\ӻ !)_)B0FGAU?i:;YX vް=Ʉ(OAdCa$/f@)$R`bsE2TLT8t TyBd~!m0y}80Opӷ pyo{!gÕ?Av=¦vqz }*9 EhPjha4Sζf9{sޫ\ /'iMSWUy$* CWlshBBh0WeZMd\@"Bk+Xufc/|}#̲s>an痏)üsR~E(;kaQ\mkM|F.^ub`|̕@Ym#5uH#BZ;c2bmF\4א klr&Rդb^lGyvaSy={5l+xܦ_+&,K6y}B [L۱/>uUw?$}Xj$sD֔_dՕkqt]]̿/]?eZq͟f k=RI3!L3{ؙ][u9ɻp}CqKSs:Z Sy*h=p uP:6%8QANb s4^jeb~Wdv+T4A 2M`uHJX3|)^>0ffHdO1]@99_;saj`LQ} * 1 N>7c].D.}q6{_\y6zs;fY|{kqw(-ѯʥw6xIs0A! 
؂)&Z5.gk'D8"X0LuJՈpK@9bE;)Tm`}!#inMA݆b{^lŧ5wl PTF%PιUk.V'k "%=O'๼'Y =5)GmҊ\5BJX1(ƐR*RsN`'['z幕LURre_Rզ*w#̎DJ?*}jFgw=C#Cۄ)R۲;.S.d'ě/؅[_Efe)`H@;}MDu'vታ#ķwp/p=ξ#_6$Ԛ7N;E9rʊ}ql:O ʤBֶ =R}r{D]RvMB1\R"IHkF|餻aќߡ^9+ۛ/gU$_s=K)\yOՌI%^<~Gl(mzyg9w6ŻvS^WDeJ9bQI+gרqt>"m&G5M'={br2d 1T| FosfࠀCÀ₵!-UƃGJB.:`xOʸ6PC)>8ŨHugG=PڜAvQeL>(e*!&.9h D9a%oٰMU0@OtL@No%SSHsnYcgJKL.1 9b5![')*J w0\<(; S .esȁ;Sn쐾 zyʆ>s1K]zI s]keHqbG/Ď6ŎF@<=\}7M_?Jr!3BJS3\K5.]{@\d9EԵR[6Eˮ[ rdWQ5 yS?Q!`c`--k(PU`l#I1:Jvv%o|Z>Պ=0z)qZL'h;vR5S.NYE8 F&W~KLNh]GjmMI 1v.A"mN-kf+Cn<4&g-T?>-=vvV[&?Uύ@(`IQжr P@tU%'IEF|͹2 r5zJj`>WbȱP+bi HFnlG~\v[ǂzƒiKsۘh1QvYZܒ? x~~c]:PPŠtA+S{g+DO- Bn)PdE#$'Q0J~9WU;8p4 0 "Z"錈fB 7RXU46MEgIv(Ĉu9Vb<Vy,u Z)Qlb46)`qiQɐXҀD1'TKCfٲ41uv[%"vEpqō908{u5Brr@A$}K( &*.iVb ]r.&Fձ/xh@ؖzS.s7q'ޏU+y#NU&Gȃ8Bc͌=@sKRG[ 8gJ82З,P}c;7Q$CR߷zDI4Bb[suuwU\B n`*Q p&q.m b ھ4ꐙ&7M1{Nć'}\W m{U]3g[]\A82#~{:{o$ߥ~u[)L?1D gn/|TG?6eS|z*)\чŠO]dELodf ՆFDI &)W"pCS4{EDB[16r峻/Ku ᤄ~v;hxyхl9:p#%/P(>'@r'ӽiMeDNpolg<T&F0Ѫ)Zjs>ߦ*o/3\|o;o<;*:Pl  Yx&zGb U9"J(eg9ᒣZC폵4H -(2=MI4긌 DkQ¡  BClYAKPcs/Ĥqay}vTn{]]іfۋi-&^#8 IC ', ?{.;OLU}Ddy2ƸC #Q>DyN}3"<@ϖ{"rW d !A@p[Ot13'AGy>:\8ue=Ud:J8ߖ3 zHw7TJ )ͻ^pgٷu+h9>õ8.T5B.V3}ϮSrQy_x}69^x .c;(=NR[su=:ǕŸ.9M7Ǯ?DurK[r}Kavc3ˌ 0BP(l7uF@ vy7mg`FCyD2 q~ǿßMeuxc˥N۝g&hMq0 F%VFpyu'"[qL?Fɖ's9ю&\,&өWQ(f>@hh.l'a݄jfjJ 7Ϥ]vN縲5!{2k_Uo_A8̃yX3d\a0yK&v ZI/UTx:K8]F#X'sRD#WG?vs?*R*\T)«Zj\!glw*0gU8i ^B^rjA҄xoN]ly>,xo+0v~*>֗?jz欁D1N|gr6k (|Pԡgy^4z;rcq؜h𥛝S Un]YC^-D2HRp4DR-Sz\(ahw'­O[UH4BMDyRz&'LI"`L` 5Q =19[uGm *OBN#̕$NOҘ|:KQF+ƧNrcF3JmBs*6"Sc[o8W*|+(v|ۺcX!w l5wL&w䖻c3mwd*[w̋t0ey?C (~wjʜMg?{(knA gϚϖw;hn!P @H%* 4Gɥ4sR>D{~aD7XٚzCoV|]PtihgC[-_H*z_zOc) ̗:1Cib8%=6wji4S˳b0砅v+6`&=+,=ODtF5HJ>z,*$X"BZ %FRP|ӭz!xh1E&vYdv#yyZ-6YxĜT[Jy0_v캣>n=ux4VJJ%P hK Q1KV Nbie09 '[}ګ=!_F}:E6r݋5O Cu .ۖd!ak9C-EAuli]q]qq"OP.`R~5J+mBs_ gUi]Leޣp)&nqQkiªբe]ft]]^ockym!-MCZ(Af(kZ":_VY5umyOQK&{ TRbK/-LX;.`-h%Ik pZ)C*mPSR$*m.bZ9!6IP"o|\-n:l+| qf١4 31i\v%& c_gNˇ ' ztB@J BZBޢd"a*d#v 5vhiʧd)D=%,R'rAbe , KZC|*ʉjV3*?PH^'mmT%20Vk y&<5}1r?ł\~F{򏗹rVg"B'>jp4]Q[0(EPzL"!@J9W)Fnk U&`T0dqOb\ 
<9'fN`Q<*% i!"WNrT*g8P Z[ƹDl1rnpyhĉʽ?uL%>v3ٺZi;,e3LwU,ș(G2 ƒ&:4:0eDRUYrي"xgXJ:@Ҁ5NMO&\j͘ ;k:4)r8h0M6 2*+Zi,K㺚Ov#oN#_yGЎgΑ6E4o7: h'DEùC++'QH.0C#Dk7?\-J@L>z]u-z?e[Ya\nNגWjy_l05dGhL4ML 7D0bF>Dw4\d hA4B)U O( ACJT!h{'b[8$d\{R{ͧR^nIѾ5.)~1"p|:Z ֲ'̴2n9mqⱄ\or&b=G`)CcNR A8,dDhxR<8gi5s<)DG6p L 4!1@*T;8Sn5O#'uz.$mU%űĚcFEb@@n6Ȑ+tSmvh6D@L^CH#J}cRH+a]d uD 5muyoAdCx %xOr<'0yj}[5ޔGz%hxYڅ{ϣ⿾G(~ 4' ˣ~S *~Jl 0!Nm(u1ԃNUtŧ`8M[5!9ac4Hfĭipᱣ{8#,`iqhܙ9 ƚ&{B?<9ҾD8|CPWsJOAټ'Ǯ{u*?ewG9[wmmHy`;A&4{љ<6N-,)m ֡"OU+.svjRV_ii۳Ua16c(r16#akXկٮ/bef/Wq/K DlNDiq8fIjN?uUɛ5i7ܑ4(;,6Aֺ)ʩmv ]Q9vAy68Wfz=v6hx9g VPgs7F }.v 6&6ghXK]bV{xCg&l'mYM6iE7:μSaD0]3ڕ,4<=^l)e:c$  i&2oyQ:y]RA`Q,hFS.}3P<8 //():nB*,H2JQRckFJi5W{jbxcl} KjM|s~Q۽Gu(^2r}#3>BwS+ 5^wnr &dڑgQ|GUĀs$| NGK=597b]>)sDWD;h->(+PhC8 !HFJ E'ViN3#g3:"q\=Pg~y{GyS[M[£M gơh5((nX$d > ` ܃ B-&{땒h^h6Y-CJ>;  Ћ@I_BO?2* <! ~[̉|_EwhnnXRY:}A-yI􀅳/c#h62'(*9d./cE1h+B8(+;]-d{eyoMgdzgKj,H SXEcJf 륭v5U%==+" -,|]6, l"ԵXEGZKK4Q!UhЁl z('30Vx]Aycu#lRMf^~Hd!w6y$g$q6iNb4D3aǓ:ޮ 5kC~Ջƿ$7y"ݾW "Ǔ /9j~*ˏ?G-KJQ.jHYG-ZYS/3>zƋ7ˋ?A?(]D碭>MBۦ%6{&>fg.`[G AZU/M/г{QEsy5M&,jb .oаrw益'_zJ>]>B - Bj+rVY_kcJWӇ.]Mt5nLS1DAӃjK %ł8,cpbJ22Y87DH(:E-1AL]0<#U#) |JuqfGܚ}JzAVU%ST<o8枀,] j'GFk |$$ MWMrZ&b2r= E `_0~ s<B&6<M)q15)|ݠw}SGv->A dzHO9.;'iB Mk"(Y$NAl'u)6Ų<hMm1\8/ sULC `H36"gKr燔~>$eG{9M9(Ubgh x62A!8<ŀ2o M.sL%-r<*2%Od +!MF!*.h69*5C˞8MHWXaOԛ}@ժC$XnB:(PZ{| ճz{ni+AhA G?єTDF-A=qЀ" əTp>TVvɇU.bRnKNc7@sv߅#m03hЁfТGT4^Ɵ 1IDm2cjr, ^s͟tO2w&2y%2MsqD |@ ; GZqN?i;V&*ʾ~9 RHompۦo2Mk½#n!qrrj`a>mA<3qݯ͓wW̃%bJbF̹2f]Y׹"dW5ěv4[س8z%u$J%+GW Va= aqp*|<:zz3|xnrBv\5ꪹ2śNeF#,}6uWczDcI5rXөB靈y<ËȎOO޾'N^}B뛓^\qc8ZE{~=fxCKRjhfh^ro3*9~+>ߌ[C?Wk/G.5f}/nv2 ; }F1?op vQSe3yU ~ :~oS.d_k#-"ݙ=̗1[R8%ԝfevڻx).&ZWhh=Mh,$"B QWtX:jkXؘXmi}ZB:w*Bk|̱=rFQw3`*s uP5,wRb'o{i [=ZNz/Im^VOV"^<&)x*Ǚ%LOlf}ܜB\>qI#:f$^z EEaҽ?29atI[V΋s)祼D4>h_)UY's"KA=WуK^!h#Ť4^"3rN: omwM}"eÅ!ǂRVG<ų PI֌٬abg..W]Pw*#4-KLy7 *{v6|6|4U %РT&WGx4:(Fb5J6I&I&F'.C6l`;FnV\iS"hfNh0 IɮD95œy(Zw쪵Yaz#ص,M0̈́XydkQw&XkdWV=X[K0~(%CpU"q9W]"-JpH \(WEܵR*-"UTWVXd]χ? 
.,TZb=]#5]3]>0!L(8`03oa04;"ٸe=Z]W5;E`&q-WY0]= v`|\<;\O`4 Wײ煫H+{&|?)y˼J\AW>zCpU63pU}-Ɔ\BpJ(í\Dgઈ+yWJ%WEJ =\@XChWEZW$aWR!*k \ih;\)•J.yWE`愈ܵ'-~Hi{%•ֆ !*["qwUo)uw axͻ*^Hٿ?xݻWyׯ.>}ח`'i6iAԨ:ɫS𸎅cn*.켳=yX=eGse0M5oǫw`Z4aeakX`J^>`ixzJ-/`öoTwy@;+L,U%J +T>Q( [KkmA^͛O&FBbȕ * > w}ب`PܦFE x+hI"TFlj !9sֈR,FبpƐ|]RC`7̽Q:h'J5+_4%ݕ/ 㛝.-ϕko҇8ʺ&"SR/yu'/4q[ g{ZG)5^nk}NQ1*#G}5*VFyy4i}wSP1 }ϫ.wk-2L~\!elZNw7D]q`b},8=7|7c[-[k4KNv`=w]*KF J/f3ӄ aѤ4w]Lqq_pdS:LD3zz5\/^&ώ{apkt[J"ۚgV6. Wjge97g oם+=r%̉:ǣ׵"R/gGjb赬Q]3oHR<ɦaanf]Qޓ4E3XfrcOnn^Q}Mnuӽ ŋ{2"xRGM =>Ϗ<``n&m~h OHO?s͏0oOoh,Z603 0\oo~hŚ[ Fn344y׋ f\-oh&ðZWk0Q|7&|YԿ>)N+, 7 Mf~U$+I!Đi_>۾1 wխ'qOp([E|q%tg+T5yYy:E .Oo>>ᇛƾ SɐSY 93T 77=*1!/M,r're0 &X2J#A"*)4'uO;6?ڸC4/zyRѠ缈u&JDC*^G$\rȆV3#t!4N1ة8Z] CZjwZ]ju[=O20ءT@XBwRRt%HLSg.T@Ѭ- U6ONƣsS ǯW of3WU/Kj:NݚW~ ?j zw&rm)c?+n3D#^8v \G!׻iꭵӆ1eTZjM$ $$+ôdrPyҜ ߒUO!|ğW8} M}䩓隹's9*(S\ eZ F Wdڌ+{mx;7߂ZMMoVqՃAC2ob٣~oy6dJF5oln eEРx Kb =yPH>ݺJ(EPL+*@{.dg8e|3b2(&@8˹)=kmĔ\JZea۔| {˥L\-f9r eǭ&w?fn֭R_S!+eq+r+d ʖ7IXR܄hSDC4ט8[D%zԞpqZ-M{iɮ(Ebi< yB9X {f ''m0Vx-ՈP$L,@RWP'[cb[1UxvGU نY:ȭ ý3auAp}3E?*kME?UkE?zn* Hu7:m RKUFW2z]yeY)]I7/@ !oi-r]fL d|2Ewȷ01ň6$ W)P,qpcrq RN $n"Ӂ2})/J(OsiXosP9'RVC7Kǵ}=kϵ{BEdTՒ]vk޷Sg~Zj+&z[J=h;у=xٔW|nKO-ХغFEZG5ރԎcN.1!ZCVx'brt٧$ &f$:\.mCj`CۋY( թ$3uZu.qִ.1I /.a\5㲾v| j.* PJ +&hMw`or?UYBhSEբ<<6T{$j%]%d@g%S9\y#NLQe,0^vhY))NkόJD\7uژ8[d/NNp761:]!z$%ƀ*hd2x]T,(5SH׼}FLmpAzKi\'F˥QA+cs\ E.ߕpw<ԫb:$8h( Kԇ)2# -.[JёyhGRlWvQmM6u4\P&;cJj,-BYJg.%˘U٘km$7E&imipT_NG;ZNo5Sv6iϷqpw'FV.Fm.PX<_ o$z߽Ϸ-rm~O>[3upgs/߽d§Q5\[38[.ԁ".ޭTT2gT(7qQ}\U”I✻ !U9ɦqQ;i˅UգeDYٮ]@h4{aksֹ(N255ۏWG1~َ9(KEmEu2W+a$ӑj:by6xXBre>x07ov 1$d=1n1 ٛzNC3~Usћ/B˫#u?ou|5$;φPj>,6tj pf; ~?+ R|?F7ٗ K/ Q).~ y{VJY{?h{)i ;<%Wznp!lO`2P@j|v^\KN=u D-wlX6 |Q\u@;TK*~Ƹ&h'ҔM&_|LuɛmS\4R gDʜ? 
yU'!K8QFGυV()5khe:%* *7B G >ړI'H_尳2j4Ӷya_tT\Cϛ8^zL\!{hf?f?v)|ʨag)ɮ#H(T e_(IBJ+qt߃d"TE RT)X,QG¸Jj3$ M\ Jʉ,[ Aznl $GI@e$g#FksNSRX9*j}_~)N^f|#'m}%<$-q&'5iŠ)e:19|ܡhhн ou=}fW=ֻs%28h]0\cFڔdbA-vRo'0p<\d_9O~bӞ:&\DzQDqjՍmIxbSSܛ:Ѹ8s@ݹ:!EleQN(Ywո1 95rՔz Ş~)Y/rE~J|jY?k5*WsVK*>`}A8QwE^3 /gVKԴ V́>xaWJځG͢D|q;Guu@D墶{{yT>xəQcpg|̅zXp冥\2IQgㄓ\ȇ*GKzfrVWH$@niRuځ#g$ړ,FSQ§rL.fdc[:#TP3AUʛ_CRCW^0vm$TT !%c\} ]yRN ô7hH(Z+M#C\pI@,ucP1HL451sDį$"xY-1 #bt,sͯ/u}0_ۺ|~ؕc 3'ժuy-_ftڅ;w=.>[}+{IOXGTSs 2kuղq:v;uز–Uíݻ7zk|(C-l2~ϛ1w8us;\z&{&7Fh͚ᯚvufʚksJG9 Oi>gS5;p"2%*m{=(aY19U:Vwןs]Ou]O}]Ov]8QFyS d{(ax0R9XK`%e| kKL:șhz|ߞ ܙn(vo=9,-a-v1oF 1ʱ0H'RiI";lkR-g;%CA)4xJJ)p%d$bVro"ʦb9OȕSBe80(3Y`j-mm"0B,FNk&ܹdžî{adukO0})|zܡRC4?\y!F)b6k"3@{T8P*F-u4Mk%Y$*y- Utt1rKN 8\+_C6ݮ-3YZ-5z>L#&Թ9*`W0W4o͎|"}'!+ g(Ύ[< .~ tr PL\LI PU s>CSX1#C̨ 5s^=ݵϺ[vRo(73ǖ?q6ӻXµt!FA1 q&FoQ"')(q,Z0W<d iU=ym'`.D\F[94!v<6xxûA@H4 7k +mEۆ\ydr1$OGFb/Ť`a݆-2 "Qe7t \v'^*ij7v}Ysr5H$ͨ4$Ҹ .8)铌VsDv*KdsW;) r㐭cAq&S;2~fV܏rbfr=}ͯKY[5AN}k$j0i;urzJɝ- W G`G[K=kʅ: ÛUCkZ7 z}yꅺz=TFe$@z$vFK%j#TyYt2ߠ]!鳾!|C]`D:f ֯h<6Q\<{{ t[-QtlHrC?u㶎ݜ4GHtoAnoț}9LBA]yA 2[/2Rhܗ)T4]@ocmH+ݺp#t{cm2AM#yZ%'jN٠:d[wmB7mz:ld&V7GKnjS.;e6LD s?xqF3s+- oo<]b;_gH TB<[gTJ@%BvK2?(r,107x}n4 ܇.>BNq!,wAMB(qFJK  \ӚxeWzE^@ny\|^LYRBei .x]J9a8i>|a%g/~8:sǑky>ZCy>ʍQQO]z Lb/F]erO] +Rz*SdAu4c%L2 ^tuT-WWߒ`}A@m2~o|dƹ8V LPRƟ+vo$EUp^V01x#K>̇^lbl` edO{ؒR䎣Ln7Sb1DRmͰ醴v]ټ(.vM+SvZ voh?:ަhًx_#4T|$ס{F_SE%(K6`09VO8BpA;ou*Nۧ OޢE+5[)W)r6yK0MWᴹo])uZƳf2]^i _ͯoU(er8HU:9h,al)R)yPPP}_Zbv) k 2K%VZKφD!TӰւ-9 c&$x[eTdr1``=aɖ˃Mբ/.Vˤws?irn7[ Xs/&mһCgX0O`{$Vy֊IxJMo g\~z@d? zOKoDvDDv'Y'O{:_c$izY}хR/T=IFO?$;[xjN ]?1E5ݯ8D4?b*yxWج1_Nj'/Fpsa 2qMVVkǩ.j;IPNJz;.|q>"bǣjVLNv ӴS ,9W`rBKēRT.kt1":lmOac"í HD,8M}1pHdE#"]YYgFw6I&i ׊_R;{tyyUed6-,|ѫԡr [b 9+A!cUSv(-9GD,bR";ý;n:vqlN٢3TvN::^I`%%,i9: >;jUeZXwuפM*%e@"emQ-[r˱,??kʐJ!NKO0|(kҰKbʘ-=<~"ztoпu;u%Gjr%zM8Y7rtۊQji+[`1^蟿m0ѿ_|8?U`~Pr%bx$уru;j[ *^z~lrXmU[]uվ*m_m+ ,G_́]uf_#9:_{Lz<1']TuIdKXi@lRZ+Z! 
[binary content: gzip-compressed data for var/home/core/zuul-output/logs/kubelet.log.gz — not representable as text]
|c77M*AH}Y1d{2Rx_8t,vo[R j{3o;hqK[3VK!F!j򇨩t+ܥs;*})HkgOnb?J`c͉MGKNkb?NZ+W l`ŕR  V),\W,Pd*UҎp s@pUv] PJpUYZ)z[*;^{KefSm@u;޷{3oz i`q>i~F~ #`ڍ0Ӽ/N !:)IСlG*xD3Dg`Af`g`)˳qA2)!DQbI&PtR0q#+dm Ph,1_J06a68!XbI4lo !C4ٻ6r$Wm,3d&o[Hr_dG-w䶭$mL≚*٬Sb. /LT ٪)r$GT2 _}U&T6/+7nxTHu'DO_ôp2c -&b$I gC$b&%,f4*r1,4$Rx`ҖLXJ$€ƃ6 6!@VV2mJj`Dr;嬖O߀ uzoc)mE98$1 OB"Ч?/W~: Bƨ@wRhZ2&iteF8Mt9G jb;k)1 {"ݯ a1W`C@oH+HAD1E21O뫣*ך8#.۴e'=pe6z@Pܡ9ZOQpE i#B]z_ыD>޳5;G)9eEL1X9pT:23Ԁl6@σx'k0VͱHGo+P逞G'l/Nx5&ժɦ[~a0(W}a/6RhU.ӻTzL6.TJ-(\=I\4LU`TҪDr{6/":(x搎V yƭ,rrNps1 >kLO=&r#U+{*B:O%-Fڍ&_UiOuvɼ Qn$ʴ~釗{rq^/"έ『 Ix_Up?R9 x%e)UC9?9>Ov\3Βy M^=Vẗ́"Y:B ҩP p"J&uYrtRZ2xomгQ;w1YFni9̫hz|?G'StB{3AkK`8]`emНWݧ`PބC! U%V%n^ ;b88c ot)ÒՎpOܙ0ZПl23ZDK Xn H n}LE$)AgNJHhE62,bfg;-fCu5~FYh>߯|`iyf3 6[.̧z} um OUAjOy2|+LBJŬI$V^%88[N'ǹv9&-mrMt' PI1*h$T9og}r\N.r[W+_||Me<% RF}=r]ג*ӏǞ*mzH% p~uJD|83Ճ۫`nɽ籒C|e2XΣSi77ׁ=YX}/GדZ) gӋ4{~u觶%&>9'2QW̠dFdUB#j# ,; `1|tL>~o(^5X6;ލkbsY}gӯ̟mR`b60eƩI.{ V%}£@SB[U(E#|lI=)C(͵Lg@ ޣ9ox.u JV5aMxӤRU1p#!tIE9'-81)B xdA@&Lϊ7v:\q`W~NH>w 2=+vSuNM;Gª=o=};fX8& Fh@2SpdyIcR >VIOOg:[AA|k'\YR%>sgQ`CIJ!$c1C1rMj'0D%3h^4O.ʺ?>#T.q̚ `DVLcpmJ4>"7b&EM7A[6hٮP}L}(۲lwg'ԇ/j;27ƙ'd.kA*4C/mK%G튆/xi3Qe/Cr^ns^Oq/1'ܱ7iRC{~R -ΗFPǞ="9O#[J&hXlњ{]ɛN՚|ưnjB72GMdzP&;1iCQtP:0f[jJv6JҴu%:q%Q@aVz_/WWuhpzFȚN} -AW~҇|km[%yoR^mnӼ#ym+YzQέ&0/?o`a3#[i`TXe6ش0|\wg:!vpc q^zQ۰~'0kl@|f@peݒP?j暝3No<]i%9DRKY &*gHQHP$$C@ xìL/no/~Cs ڐ\KJIF!Qhe1IμU(ֲ`^/JϪC8:COG\x+NpP*>' i)̓TJMʞxݜ&`c "V!W2@Eppez8d J2nrȂuɒrxQ =bv1[e$`Z=(Q%jENZ/W Tm'6},Ne}.AaV6Y1^}&JzN_eVLUVFllXI0_UkKxj ɎߣB:K};5b2#,*O"ژ0+ #)8gYX/IŘ(I $~/e0(ߍ`x[GKpOtL}9:e(7v:XkӣD=W?-CE[ |z= mJbOY~1QkEiYrb[huD$vdF $"L A `9,=2G[1u*\\K);vbvcX %rƂ˫[NT𵳆sC!i?8 TUne6cAu eK!g!W"VM+?& "JV-greo&aXO5uE8rdrDL0D;*p.(2nʙ H !2,i$ws։6G!hh/7!s>݇ՠT:ϖY~ ~gmh 7,臏".x3Zhǜr%&?DߗFE W_^NIw$έ=.oFdIwUĺ~%Y< !z 'Wh!'8Kqs2_"{"#`*$a2@)c`'=áQyU-=p2DkgRbFzZxknVE{GhUþ?޺|ްj}df, lI.dEhLEj`Ʉm2yGGլPή51yRFجofhbՓcvFw1=Jc:mI3yoG8[5ϻ~Zc!C$ d "D=h˭%svv}܌Yh6j B!\+ s6]ZQGʀ 0&G5j˥R,U㔋NRrD 1׈E7窡fj h9;r;-w9ظQ ~=tՑh֤D^yʑBFn]TȦluѠߪbo ̀S@nr &T0$a"r 
-5/yyIp)Qtle˕|k.𣍓};kU-{Upp>_bkoZλrl߿ދru)ѢuʨMWͲ|fNZ*ԒStIˌq͹ƨ@xFNsPyLPf̌iѸp1. CsSvin^\풼\|U>߼agg_>}[7fl[X;E (t-YJ0ڋ* 9AxE5n *ЂW&XNg)@i5{48/3v8k!l:,5:em32k숷VV |BJDeX;5"[N|. Da%AiYFi3lk#ҎƪsRḞnlq/#Ȍ #.܈O @ ",yY4(rڜMO) !x'vZ)(q!;XCf4L19,X-Sҥ$AiF٢": /^9)9(ٗqd^ą^|n)ޯ0伉oDBY8b$R9[o^</΅G!̇^Cx][6xǝo(VE[-k0#GvryPi__F(誣L2vɫJN»(oBBjw!s+;P!^X/4/>adg| X(=XW"E5) l@yfa(p #U,ͺK"i=Ud3M vY٤\Uhp,! ./5G CkdN֐lߓG-VO~ڝ?p#EW c41 J{d[cpMZ*ɇqޫN`Geo{1GGluv}>nYb׉bWtx~wn#`((dQ¼un4nmpE|"PMe6;$k@$]*R%)ɶ$eRrԤϢTR lM>d$c딵i6*LvJ6zd:- o駋鯙z.#r:SQΉ)H2EWC&HEXؒ7K LL -ZCBJvdyr149X~%@&XSgO(=) .&-s4-Ǡ. N) 3Eh6**YVP +)M:̱ͱr%( 'ںot2hINptmݦFJjA <{9(b-*Di@=4 ήri*J rms ZsZɢi- h dhTa_WHgx{6]m1Ab  -YnG: )&ܸ s $_ JN)k啟XؽFY9YLN+RJZ1Wn>#RUTPL۹fPr2^r 5tdbj:bTغTv }H lU9Y"F_‡AB`l<7{V{ϤgD9g ۹|սQ*G>bY!M3N13 ?]Tfh墱֫t'W px?Lwclo).d[ѰQbH˜W(-AՌF}e DSZE7MrffO-iw󟿗t,=FZ/Bɷ/ys[so~9x~|h琢oBA}`{%47֦I^{ _l>2~@2ڳK"T@Rɼ\_7clw6NU  >:CqdyQ?zr _Y5u-*kQY/"egk}]DJK;ԑtf+pق inR{-Hf=v`(W{uqV[m;QDl>~-n[ 6ߤh&HPkj"?T>>z C]I@)bk" D1ӵj ١WP~?~ps϶klk7=m`sFoHt~L'Lz _[ݐV%x&_mA)U4bՀ8VC3 c*-:0yOo85qҳh+LMgե@{U٨B@*@J{bA[ ʎk{DPHCɪ5hWԶbJ;R51kز:v_\++ =|3^q^n()_4ʛ:|ry2ocIL}2?ʒ9?.·OWrYT4^5qLXEn Q<< `oɣ6M8ʘeD+Fux&斁l2v,=-m"`cpd1q.F@42rqoFyhe!B5LSpyq4t>g>.mc[='hf';j jpYRЮUFʹ~KPzm#"{+̄Е ]!JP:BiJ:n誡]#ҕM ]!]1ho#j \gz+x=]5:BrlGt%- 6VBW t%(yT:MkJ@/t"̝JZ+tjm;szPV翽?im't zsw:%TགP/'7om͹ Zhz=©fdYu_=\g U~rszםy;X/>S 7«$ȍrrq^^IUF/. xm4(-W󪎠@4Oڻs#;P.xm;;/_UWW Y\ۂjmjC_6㯙s`sc̹}:Ϳm~.{2w7y꯺hɯ[ӹۜyt;Qm:UO ;mjaG4mjaGUlO(`7n-ډsv՛mGT@Wz}ofh;+l\^誡r-mЕh+Ϯ#2NiU}/t骡$^ ;+Ujh %/1:tܕnpIBW -骡z#+rl0~R .tjoh͝JG ]!]Y,DW'?+\^誡WW [1{㡧UlU7tMu/;HW[ƣ 6 ]5NWN+ڏ<)mglТ܉ɼ}*ȕ%-\Dk"fi܅݅y\诰NCZ bmSml/cs; dҼ}v'vv! m}?g$.M5n7zІZaR f[oN҄ :\&n!Z3QYn(5@Wf}oQꈮ-Fw*NUʹ,t&tʣU+KBW -^]5vc+f-w̮jp=BWr %H7<վ\QW JX1ҕRW 뇮\z+AKjR/銍Jɽw;u;sW w ]!]9c vl&7h .uZNWԢ=] `r ]wSwТ;]5vQWGIWT '[*lyp@˝wLtTUVddwEQ;d=9loxpH[+E±P_ =Еu!-g_\BWjP Ozt ѕЕ+EݱӕL^"]1S!Rn7R^ ] P])`1ܸR^]y?]pЕµ~+th;])Jt .7DW7DW w;th/+J&]@~w`vۡ+v+th%;])@^ ]%}(=ձN(t$&DV6C W6sAƣ_ b¤H{7=#75,3͍%6!訷:$d7Cv+IW= 3oC9|Cùï7t%Apyah#qJ97|@ͳ~Гw ѕR^BW6ұӕ&]\aKtɚЕun+th9;])J']@r>7DW 8mG]. 
])ZwŠx[v ])ܴkW@_RNztB lBVJѦ+4%UݖԕХvR ^"]%>ƣt3tj̙ g^>r"dݢvGkv 8lg#h;vNQI34-zyebNW>#Cqts|t%ЕLzS2T>0==˛+E\[2IW/lr&lL ])\[+EJQLztě6DWL7CW ׻ЕJQY 폆>(V#7`G1'c;ŅLh8M}s{6(E.(ݟߞ>~z}Y:3Cv{\.F>Ie[mw|7Ww{;iw8ˆV|ow>O g8st)1~ыщ60?w_zwN}v|'o0Sdߚyv滏W'ū7GGy" ݲ03p@O| }uy̔ gOgh}|t'UUW@i__9l}d+-kvJ|t- 5[ȗޙҩ8O1y?}d;ćFߏ^9Fruۛ|8E30PWZ3\ގL՘D÷=))-D"8&}S{VB%؝iBRΙ r+!lm\z3ť&2_짅8a$MZ6S(͙!v-HFk 6X"FɕV1;Y40ȍFy\NR|(/EFC^}$R,!R[·!nvHdWN҉s7f9r#b! &3qX 9q1ZcF-9Zπ@D{}0&S h5J{І҅!lCt-20T`&?!Kh*qèi^[P # Ag@x hDID~o_&+*-SY(/`TX -$!wys鬪L/m>LG*F ރF :XJqM>dws=!v28Ǟ1Ǒ֑-~F_[2dT&bC5_*X )%$ښ8@_2"$U.s}VR K -Ƌ`1ȓ[5/VWlZȑ*wr.zyR%cPG"XXgаH! s'# L|Y.MV->1rM r3B,TTPtԡ-!xy9xi@yחGɸjD(Q\z<Īv%NA2hCGu0cm̭a\,DIU@14<ƺ?K+5rz 8W}&"X#AoٰQѧc5h[5( d{z  8lk]MMX@?5JҜqj`܅p2)LHpm 8) d>PAqi4\ҴLobs oDX8H&`EU%ٕ/R uW4aq2F-[@&X["*ZJ,՚Jy>:lAZUC.$"XY_u0wLL\CʝCI*)>#72D VU r 6\_ b7ڛ[(SѝED]`#)#-EUPjϲ (Ez@?P֑: qiNvՕ/H!H]^UuAT"Fzm>GAsyǒ5خpRE(Pή 6C :uVHu{0H6CUTM/,dPgF$=J_ݢł4$b68!99h>3(rTvND( /I~ u M:˚vZN [ŬF]`"AیZI|tx݃K<`t u:U] jm@hiU ưꘆ'z$;J@ūE ]3ʃJ I"912e(X,LrzxLPBb3$k\uU;Bdm:^XO]a=Pt5ڞwb:@vTVF]+#H b!җa~xWSy'k7YB,U'XO/BfȈh2"A]R%rȋU LɣB- ā !u $$6(Xjq[S-ChW1y NoBJ @.em1ɪQ2`Ż0 j-3Pы2Бxk.,EUIbe&S^(- ɀ!jPk GE^lC כoEa2,Bd ⬰)خ9hU"~B7^!]gѝ]5M6U#;KTf5!(m@L*"21ߪn$l%t |4EG%CzMm-ٗe;i,._n^_w2ɢ5P.n2ĺ lѳq3}P"HuZ:5WZsLdG(3!e-6#YEF,$7*365z &% xKdpr* P.(7fho[q.([uR"Ud*eTP2mMYZ@z|B2`UߺYOaQ6U d+EĭSN bBuX1wU/P j Q GEPXeQEq,IGq?^[s uM)68&ӣ+19]n]UbxYHAjփ*HLJF]@-d3R:@׮@=!ϯ{ *6lLV/f&M-C8ZPi` EHN֌0h]Ci!y/FFV M F:,`ZQ%#pZ{Zk(=]Ge(utqHZ};Ea2 1_5fs˥:pU\BC,Z*c14&u܊Ppԅ˭qf@Bׄr^VY8D;n0BI>zc7& Kt!ލ/kq~)~9.r/Ɣ{)4{W/K>8}?|^ >qqT ey]޿/+tos~P_6zpﭻǮ~ӿ^n>~W>OCu0wB}+8y?2NE'{Wnu';;] _Ņ.S/o./xWw?N[|zu{/k>ɿ_^ܼ>{lxejp\7\\||pĤ/Vmۊ~7'9ڸk뻛Rѯ4wSA J(dcܸ` 4 W p5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 Wp5 W*6U v W+?Tަpr WH4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M4\M/ib9o7d`gf W i++EKo 
W24\Diiiiiiiw%mY4`Tcף8dc0kCHb9ynQEvXbQ{N$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$+4g$ih&;/W(RL$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$J$ڬ꺂A{$5{>XJn (1kyCQ5>n}Lm 50ÔD$}>î4W]ѕTYwJΡ$gGR& [B5{Dt(L}+ҺAx5:Fp^NPI/?G? G*?4ΡTɨ9g>}F ttZZ#k&̗BE.(.+GU|&z>?XN}ʇwe()31:?Na$'{5`WՏuaOk98<`8a")J' "ܢ;U<Yl`V=Պߪ:E: L/BZqtڛ`G*7BNU .X@%~eamG-ag߆3BoUweMy'̓gvy]kjv@ 3H)ÍM%VCVC(-KjF+EH  ByQ6#Lt?te%t96rg+_et֕s@fGej .aoȈJ% KY6NGfg)ˉ(Jcx_ ۏ/L7 J Ccڻ s:@"kUgl8\ѹm>'_뾒YQzpV$۬}'js/OC矃e~/\U"!Hr9~1f76( I㋁Y y?696oL y̅'6/da: ZkfѲiж$9ȅ*.9z EjGCfEs>(жNsJRs~݁z,hd(~#'!ZÜO"[Ö@7kQhZZ@5l-Z@"Z}B M:r[?Şj5z˛lVCk ] ;pV+MO"rt(Mtt,T֭+kI[ *v%Thu4j Q*jJi]!'n-NWR$jJZeBx5խ+@kv`p}B&C҄+[DWXĮPm+Dhԉ QFie[ZCW=th4)vteV1mZb Bz72 J]#]IgdDtJaDi4UXn!~fgZޠ7>!;s1vKY>+ntφa|x5~\'pl !t>\r+I#-,- IR>X\ m‡mc}!/@-gŮUX +ڈREQZSF'-3 AZ&:lNjCPPF|\R] U-7pxwfk-4uCIvVSq4{Pw׮u 0P*8Ouϧ@&: U ba@s \ଈfmK"1ps!9@_ǤAo3'P魧&ZU҂q-pBJĠ0n l~pcˋ7FN3. &ouZ/3^`-/^z1U=̻ٳNk0#NNM>6snF )1# q4wɝ ǓSB py0LJ8U{=ʦug?!B?.֯S0o |hW,&bzxZ֡t(SJYO5۠wv;N/m`!ߛqЪVA_;X{w,f\vҲfw>>d8U̷TɁ'䢳0y54$t2G)]G_?1N')cB98.ES>KbK[Nd"q$ %J ~S;5 LhWajM_ߔ:h8(a_3&kFILM\.Ȁ EDۍ֯gҩ m= W5QXn/6f>O< Η\IBȃ %'\hb&`orJIM!btQrB+FL _-ZS#e'xa3`q. Ia^2XĆ̍ʼ̦;cl*el^"aJ7ooޜEeޜ<޺?yf¤H,ѻ\ ÙK8vi ]dPit(f( A3`*}.?qeᡑ GKNJVXy6L Km(8S`_[)0"8PVUBK &a/ x5%6Rj}jUC$8nB1C+0q|O55=d Y 0 FZZQs6ꒂ|`VJpМYKi7p~?F%.hY6u\ ?{_HNPemY.ZFŀJz=: _hk:B6͊i0}6`< $Qm eE`r)9bawK%ེ8Zl 5``#9#ZqN8xDxډ}kN@*-EbU!'0 WPpdz1'^3UЅfJYÏ";sj jYzuR.tH!?<ӾX S]k?Ma8)$Tvg3{:Y~Tx1.5+`lF4Y9u{pR+zmH ~">bq\rcNoIțs:-v{6ޚMA`Xq "Ab^e }kܝ뼐z[YeYɏK>? Go:*3͆u Չ{oITAN7ogwv?9~yO/z/Sfyt0|50&mӾւze/ZMe ͳF<$kR*!M|ŗζm-jq>e |!\8l[3fq Bl構, WQ Z浪PPQqF@K {6c7};}˖J~l$RX-F*ʂ5a-VFp Hm|:0=_v1{y[bɂю@,SBaК>^!O4◧xi`0LuͬdzfvW5lO\5GkW&湁V#7Bv=vw`V;ٺ*-! 
v%Oq0ˆm]!J{)nT+cnCo|ڄ\&RHerL^prE heHrV>FUMO.%Ydׄ)J8ؒh/ږ8MJx靉,JͭިWGpbI xڛ0R^ əT0O9+γ {usW8>NjBfnXPa"r%\OW7ӧ/rSRÐH a=WћI$SঃW*B=sުEF;]0f7w#lH7vkF+Nnߦpv1qA.ԆN>L})iW\3rǦ9U"v[}<;9f'( 3)3%/JbKSxa 9%2hqlK=P:F%3C!(/wEa20%6#bS.H" IpC.["q#,eM42eǽԬzTXU51(י6vgꭐ=%FIUrO-Dd*bfT9QQHH1R)!AnTL[ o_DÚ &12`{.5iIm.Ne <9뫯lcoybrxiȄ6I9bY?67s?O/&SG5 { υijGDwg\SzisnS##bã}ExJ?`76=ʰ ?SWz6/VҪ_Х_,e@7vHHi@ʔG8#"_1 S:HY8nXI8o^ h:]fD "ui0%frb|dStQS-(9 ` !ؘH!ĚJr }O#Wkݥ}R$`mvva^Ʌ!))0ydF>n=Y R4`͐vC0E ѸKs<0P|v/;CXǂkPT+5 .Kʑ[WMva]CެjbE@&91({VO-M5FR߻b2붔u+ARd@B"C{Ae)X&sHB`YE[kk .=cBuzb* k9CQ"YTq?؀A}Uŵ-yuV΄Ygt!IʴaY"2WC.dSj8̀ޛhK0|^(vKeWyoZNbq54+LZMh/N!d|Y`֢44T{fI = =4 =XFںΔ(1(i~BUcpCy "<*WXkv]3˚Y־kGn&]z~~HvfZĴH}9+ۣ:!%db0ijK3z5dr`VΏͻAP IJQDEP ڊblmS537Ȧ (=ܼVҔ.qkgi{}ۓ#%jyӝ4^4zw:UiAT.ճ䬧`Y'^P3L3~eypr4#Qfif`x9`КJNKIL,QHC"ZN %TJXJЪ վoEiF@3UT_nS=8mÍ4Ij!wlftD":0%c&*t@s.j|l5 1\\]3Bό  g[Xaa~qZڞi|~sOe*Venjm}ĬrO_c6afK;giw}wW_WO%WR?nOȖd_d%)E4\zbdoRoWVUFyˑ}7t_EuB49wg[97˯9H{_gG~"ч [$Z;=ܶX= pyu 7Y$^ lq[PfڭwHw &w=b˱_ި_hyA![w]Mq9shgnWzvassӼ;0WFUfʶuLee3 9BP&|5sX,coV s)jZ0tzR'=vV\.|47.2/*y\\ gG1h'B'8r[x7gdsTjo ނ $ΈH,)qϢY*L㶂m6U%=9+De>Z{b!%!5:=%i2JR O@4=ʱU0:R+P`}F$ooT)(Od $r< 6y`m h4^NB%ٴDklB U^ )(YWH 12Un x,Ek9Or{9ՎZ"O1 'VCɢz=29{A!)B[r,nٖ:θ>t8ql_Xv6n:z$td$lT\C/e@kѸTohU/xVY}Oy,Wy&of _/F㫻̽_r8ލ\p~Q۔?j]eFZi5epf#0Y[g z] ~74]8=um>(54ز nlhr\_֗^s/Y8E͏.xt6N{^4#XpF؜F8> =UsGVRRƹt_&AQNJ c 1524tib6ݔBq`h!mx[)KC\'H!oy$胢P jUgO`P ^C PyD*C""Ds/8cA8Bw{fk iMQB@ (PjuJk%'7D4s< _* ~L rj|.-L%ʬbN ږq.[!#gO7;.ǣOW_I/ԃQ;76CLMϛsP]|8|ȸTB |ģZs,#Jg15YAQ(E*Ϊh)U:"&ZڦG=Tk$(⬯"mJS\{ͷo)+f,I4V222J>LO%# Skٱg j%v2E)%0!'#2$0ej'0!N\}JN $\5F|]irզ^I6#DT8a<ޟ ۫zѴ95l"|5)~DH* ]^}T|˗ZG%?׈\A~xKۀQ*St>r:D꘠)@K"4RD Kaa6HNw.elrx{j 錧@%X&Zg\g}NSO͚٣I#Crnn\ύ+#B7ۺ?q?:GLS+etVDZZ+qoxBf$X3d \nY!S+屛2JwfoѬ`Ez he<T76 ~л0҃vJ*/-x `8=Tލcӻ{һ͢6Z`ʀ^^nO+3Vpƙ4Y. 
+wUt_kinkJ Ҝƅk4h[ @Vs`I%Ϝ T@Hl1E B:&h`eCs&unc#jO@Yr:(UL@]$ݴ+`+|r$GMtO4~hQ[Yj~zKgj+ij{PNhs܊:Uc}؈ IUBPo^`qzZz_cD߽jq\w-M  qཤ'\hb&EI 1:(CBXms 2F҂'7wϹ*D& ϣGS$mJb~M+7;n:_;oPъRqv6Fǰ E& 0 e< ] q+Ƈubض#0"ZB 'B5R2BT\fB /zAerC1Yטu/@|  ^Jm" -AH/3(r-GuQjjiF 9FZZQs6D9,7 @:/Ȕ1RYF:*KjA-G?ӔTDMAO@4nڠX*xвIw[ոۛuO/п\XZssM؃?gAh L0ڲ5,N{?84[j?|Rfɥ0ya~$gq"6%8 u9ILr8le ;$'gZqN9$^d$loNJ`=u@H.@pû\Ar6NH޵I Ji&}B =?j+wja_}NOAWT -t{aeS~Yk?p8-$-Tj~[O{v9=Lksu}lz~j ƞ X-\OaΛyv9+\fßfY^bbqBr(i-q>I۟t1lcՅ0ᨖQ|[}4\'z|f?687>OrS7!/.:&8`篣1[&(R/[5>2i< ?/q9鷿yë_}??yGy?_{%:Fd ߙ_F=^%)h|܈}K}Jys?gŗϭ>,՚8i0|ߟ@]^lYMun7 0F6X$A{ޤJ .B-B @@Hܚ'v$HObMf߼DA(NΖlԄ|̀TZ. ZE@QAHmӓ0yU=htxTޖ>/븘X0ф9Br:*Ciy@F 4 (O7T)י˷Xbkb@4?kנ%80.ޝO;n :Н-;Bw[o%J=$W(S US2t)'wiA}VZq)frEl"ZBIIQk pOtEH5Ic (,JͭBdL>W C .ot/Ox {-."vSd)ɂU(NrFKQUs`y0o޾hjԆS.w{묡ئf|7YmҶ vmSv`R1uCFZKZ@-*6VTk]oҧ9v(r8[E1!J},çzώYjHG$NXDhs8c:B$h/4KM3aHYVD[wx(Y)DǀBD3.1*ΌZ݌uLq\Ѝqyu,~b :*_.?8䴮Wme{lǝj=̗OHQ+&iP&(\>f"Su1Jg2*&P&Df#<EdN!@D%V\ 5OӐ=Ρ&Kpb&&!)(W.Fn&㰢b.:ڬf/Q-M0`1r@LFc 7 eTDD% $bL#ќJ 9n"rŘK(/ֈ)~HG<EyPR3.8!()5i[E3K{#go눟ڡ\U-s*9T.Ba\"]G48aN2kR52(=*u%5#{x x.8TP& –V:ʲSGO_pճƍz ~<(jSaNEݣo u=^X/pylD-I=>XD"Ԙhy:%* *7u,kZW7^wIP:ĎY5?_2s{~.𮝶UhCSA= > eχ<m"=/O!9 z5.o.˛x';6T i+ytN?>з4rx}D`.ӨBq|QS& GSp(0k>5r`?ksAv?/Ԩ/dSᦓmж:mfD`?"ˆVݿR}zs4n["wŇZoY#|_?y$m*!6dKW\tIjN8|Z)1\OƳz F91n^@$R%D2Q Vg 5ˉ}v~j̧ E(= Rȟ%u9OT5璉D B&.K%DuIBPƹIB81 L$xh}b)˔T҉G%%t&#rng:S/ؙu(?F|%XyKQtgUM D@ W%gL*q&:9>'›샲J5WS**|H΍0ǃ@$SBe80(3Y`jDzM|QWrn&ki3|\,:/%U1C\K`s*E `<$fu {Vx$yWեjr'qU0Հud4B5z՜<(˸`TVE #"J{ ɘ׸7Vv2y FXCA4y=Y\GOcF_*fAFOۓdJ_:A 10Z]=2o&[T"|uO<`VNxƐev: I"RiY愊]v9|8zNeDH)w\ \!2/6B;W/qέcVlu`BB]{O|2]X-XySc\RQ'DP&i -*1'aTYgN:/*zaTyx\JSbTK3"@RʸN@pRDnIZ0aJPA=Mz{w ^.[A9D@8Ykw^v52R1WheAC8H&B59Lf@& x-߅+PKCBlB) ,ה2x*;’\zF9R߹DdʢNAdԡqFr-6hAq%i yb* iMQ@<~LHkBt:t $%]p1rA43 4~iiJiJ( DbLn$(3ɅoTĞY,gncdM)YҊw6ڡ\fTQ$:V VZ1ÿ"/;UWy{J-1<ÊE1 .<NduY[i\otVYFcq](ּK\cIIO>j 'fC'hp^$pk ]BMp w9[}wR"aJ[_7)^];.|_X=b;S-6m{s"裈S~wI[t\oT~_~+Rp|QN$*[*3I~5ɏxd1(Eٓ)=RR 9dM*ŬD==< $DT&PRh*&uk);-M.5"y{p~` JW18 1AY)⿒ģIB 
ƕQ('69P*G*H\%&hȵA3CMP3ə!9,~`(F)]p,ͣak?ݬ:\:eف="llytQ=3 j=#uZtp`hQ_JkvC@+,JmC`C5q͋qP)*I[*Ơ)d ϵA$׋C}(dgz'vF 58egQξn X{2&oxճ%;61IE][J*;kx>Y?L9ttǨ^D#=hfT8gt7Έ~?%Nɦ5$CTC3E5(FTdWMkWe]ii_rvן%{m:V`cfxV_eӇ>^Z}ҬA3?ENrlwGh|afBߛ.=q!|q~=p.?eGS*̷#[Jh'z%9?Vt/R;D?tH>4N&7wJgd(A 4t8αpVnfwy7nã;V]uBQ=:vvxoX[2Am# NjDWڽcWa`'Ǽ~,=ȊmyBoXkeEDsvC8-v4ܾeBc$(@WAҨc࠽{Q< 0 SĬ>;3! pyBNq!,wAMB(qFJK  \ӚxezE^^>]U89A7ԅo>-BA\xGR?P괤I}i!ZUc#nb T1x?%ɴ6(7&X|EES'wys3~_oN/-~LF:>o'Ȇe4>\mq~'Q+(=.P֧Q}#_ԔI‘ n5[9ː7e/^7?x (Q hkIm8CݷԢupS7=xo4۩o4H(a"Wnr;]z3|F<p:3͑d4:!S(T̹HrqS+`DU68]) Bd-#g^sƒ&;lK*ٻ8n$W8D,@psrb_9K#E#ٖoEK4[biլ*>EV=(1$et(F OQ:7rcvZh2!|$ASY!`9%qT!%ck&f⬏ݦV2I+_ƅ%_w :lQ[(|yeIZy/:.iZ7IF6_F9 *^aoZ,nY@3 l7htCrœNĒd U䳱Z[ߖ֋dE"0$p`3bs&kLJxY3q6ޮ>frCQkt#ZKօ `)9dRD[clII6 Lv0&asG~ֵo Sd+ `M+H9:KIM9PԊC4I/+""{y)3 6<_86H\e2ƜcGDZUtb>GL+2*56=AyY8֧FkNM) pEe;;]M}=k"5>4TP@{BR'A"QDYG|B_hgK0ţIk!hl-.Mb?eYTD͒[$$ qϔ u_6`xc #d14 5Zd!=SOݳ[>Yn-mJvNRrgjvr3#m;iAvgvNԱS2R zuoGn/;j/c_ Y,&cq!v`Hd)H1Ecѐ.kR>g Xa.PI]:s_w345k1m&Ά=fe|]L5v$o{7!~qOћJy8.|*g;#p4oG23˗޸-WtprttG׳~gنۅ _=25奻cp_fjMiگ}`N=zN f{DxyhX |@%j ;|1<",RxDRo;HpyDP3*/]B⮪nR [tWJu1Sѱ砂ՒWI(P`:ݽ.MIw/w 7}A{s>O˫}qo{Wm.6R#$0Er6QDIԆS&GJ&Ar9cY)_۽`~2>-qZ^/RN!jJXC%)-"{6l@hֳop_jB˓{+$N/ kKt"12Nē?YvYZoR+ A덋/7_Oޛ[CU/9 g4Kvt6;u5]>}r'eZ֓AL}yU.ں\nX `Xv!s6.ORs5۬Ӻ;r@P@zB*;b;~:͇|gzʎp'n~l5"{d*2E!rY84E⢔O3ǽyOߕ7O0ƎoO#_$7wѱ)~jWvOuڰBvӝ9<ܕNn!<䓽}N;kLّ??x}]8¥0xRgUzilYuwd`{֑VW Qn0 aʓGdy|cǧwܬ3ȭ{MuսŽNkL^G/翎Nbٔ}Wդ]#GGdk>lvQe︨ab(Y/o~z?a~z'{o~g?8͑'U"zn ׿n>C+ Zo2iy+ƽi>F]㸜W{R׳O?Lҏi $}Q}~9crTuu֭s*Jl sT&GJ4 ڽV l C{%GF>:a0&;Go_ n9.f}kpOE x^a7Z%Ȝ>̀?i"fu<Ђ*P;FiEa ;in576"U g C¸,c2-o EXoPچ@TTR蜲͖]J];U)H5 MViO/A!{XnJQ{)v( $9@ #$(1YLL,aeҰ.VZ֟ReHjYUrMҒpVlIOͰ4\gۆ}fzr}ZD؎oQT/;~?D52ZFƺJ&VB:+BI^iMQ9(Z*\Ai tYVg-oȐP5m&z`⡿Z VTS􇺺\䛰6 6xYg671 b*)޵Bu3VwuA$,`gTKeR!%؋CPy,453]U_z٬ʷ}NE7 Hu&hK1S0Du6zh:hQu/˳դKv0u=+?vqye7~^C)yOIƗj׫3X .zx7N2q ==w1*4AB``C$:D+P)kTSߟK|\oKh@& w"œ"kPH4tn<=Ymߟ{reu|e}Deݓ JT>@X̃eIw 2$`I!UA6zuy'Q|(EVY&M`o}D/Sd"焖& FCv:Pٳ˻7rBḘ;Ʀ7>d9ۃ4DeE"5q<2ppJHJb(JBڦ v1T8Y0+J!cbA'c:-L"u1ITG`F`fkʞ+|H-8Vq!zrgZLǎTϝ@d2gB+(X "HMiСآԢ87،w^{Y []nw 
5SqΗ~I׉܀ƍ  Q6z9tBā܀GfGW x܀;&܀ɊV& cLN)Eg\*k DXQW*`pmڸDŽ&mL63{ h|FV|iuLY%~ LކQ$R0>x0&Rirvzls,S r,hF~ɪJe)ӸA E/ Lk3dK ̲f9r| Ҧ|b5ePCdu./ ufx̭"';^_Mko ![tAvjy?-:x4_fh7g܁sstxnORi "&l.g7Vc-pȍM0i#'Vpb/$T<F^Bi bSC&1;/< Fj}6e&,Q:*%]—ʮS7nӇoht3Ms+#ΧUї3.y]Ջ_Xgj!r?PwŻɧˤ^T.>B )搿l;l:@5za* Ձ U )]6g};U4h}R*6i|ȥB 퀣^Zn>raU5e=6ot=]*ii9e`Hv>=ZN(Co#b{ɜt!t:Ȇ$M7tђP׫B]otA[c0-&I*KF+_g<`~;Q:F̃ k2cwvmwg`;㝜v=H&R5JZ`mrt$r|¡SZ3-oD@UFQ{%QRNYdh*F`uZ,7N9IVK2o6g-M!#_E/a/#cDIyOKA>thuIa 6L!fowVC(%P7.;i^JN"!owbٯ QK% f%7WBAGbH̃EphVt8FMkx !ʤ6:a >8$y?e%E0"G!x.8p[jFM/z~,ٕd̷f;.6t@"Ql٠sأ?cM^|=m3w[t^ͫbM$w!^ҴK̐[2JHMl⽻M6a>qqb?',7@וef]޵z/K~z3[',e?Bb |_Fv4*Xdh)3B1e[FKіZM}죟xa>ЦF,"Vu\z?{Bg_] oebe5 LMFΖfbCX+<ۛRg]Gozs RJEC-#]mw M_dǓr8Uz s ŘNE[gDugFAM'3:է~RI/[GE=zuq: ~3( 2o5UE]Up["762붑"TfU?9fɘR[eϫ^],VoȅYD.=ĝސp5WZmˮ|ck mwFG&ҡaOہoO > >>g >f3X)iGрVRO_B) R zJo tHlXdD@$wΥ1ZkPzv1FLJR蘂;vo3/@032{7o%(޸^ @!Xmqmrz{0h1 "'H4$*ZA# Bgv(ZhMΔ䌗)eܓJSlQɓ"v|rmE]*bPT^!""`k˅P!FΖng1b6+Nc}r1_/ۛ})zxjf ӸWUWuYU}җrsÉI-c1&N944a)TV)Kb ~{u/[A0% eJuNAX, |1`.!K2R{$*iqްH)W -eRfBEdnt*ж=1:]6JColQs$z 9ΏzH>m^==VexA@I'3]AtA6SRRHDh)EzK Q)l !rhҰI"$O/?}0[wA K7RaDF$nv(t΃dl+or[!%ZJE*A;>mGLj!PԷe7#Ipn2TWva 8)HC9VW=3$GGcd9__WUWW#y O ~iy2WOỊǸ@.EBe%1PqGK9=4I\iø4DT>'Xٻ0,#'+ W!?TeSr9degKrl"X̾jrht4.T^$W)jP-VvoHWL8.V)eRd*6OxE"J}6pt+^2m% aDM_a׮t˪J\\^'wlRdz^B'e'YUDG͚bj3µ;9yVȢw#NT?Ȓ43ާ~~~ :;jJ,ј,QG)]#d/u$k|cFP UZƪRg;]ХG]6ɿM?ŸCB~"T|S>#cځ˵"t} ͷ;-M{/%@ p5A+Z!ڼ0xǯx}QM:xXRzP["֞;]jх`ԎSl7U.Ӓcpc0C0ZWYdV)f t"ߩ͍RίRwu ʅ;U{|.Pߵ'M?ѽi8>"WI@b:>.ƙwGECF &1gN:[06HLBoBoj- I륳1Kn"J{cSzAhڻةRNq,z̝vX-"ro0CJf/hX *%/z㙨hM#F:ON*$}&/-X:?_@h)T+BA^}yUbJ+2"tE>!\&8mr*훗[e<.YPFl0uQ"k)rg.ϟ̎N*Z0A_ @LIl _)8[{Vf:FV^+7:>/OfqC`j}JJp9JhuJ(83+ "' S\7t2tPJ~HW\bl7m/TK$;k([ـ!~ ~l2VzMCc09aCH-%foonXslAWl(*{Vs3CXϫ,egl4 `̋ |'f\EǸ/GqqM8ܰEѲ}p^uq1Z@7Q1 9 *xkWC/xvAn?{Myu1 /2@q꨷ LVwAzҮ|5A_)KwH_߼<=10 frH8+jA686 W·ID\dKHE>ג)E 0÷df(|c,IQN)kȵ1O&z25&@p/f?ia=5- inD铘D)N8ZݴP{0-l2P)Bxo*h_*%l*@Wϐ@Br c?tR/tE!vJAt >UZw\ZAvʆoā]1G{DW b{CW ]]=C˽RBbo*{CW -NW %;خ#] XUeholW -yUBIHW~O`vp & :]%S+r ˧&\ZR9ҕ 7Oؕ. 
gp FFɶ+][27+h\f*XK1e:NMBJLhsIhsIp9ٗ%;oiL(l.xš=QIj-<'>Y-"}=R]5 jӡǚ(B`Fh_*}S*$@WϐX=+B&ӽ}tPXh]}ZrO`,GJpJhuz|CWL0Ոf4]qпP+Ջ~mH` JD>8`)ChiM!Zkb|yMmՐjn9Q"a8*9 z0bGE1Z+A:kd[mm%m-R0[bѤl~Uw,z==d8*{ 0'~ Ǚ?t/߿;u~ޝcyv u=0 6^ʯ~+?o@nN:a9joV5Uli;|zEMntQo|_->x)bǬ2< }g_ذ/w\]Dۨܥ/)DRp.ϔ6ׇt N:ƽ:MIb-=NALt:PHLtK@Ө`PDu9)j-rum ͉yF}kc ~͹.omӝjeAjZ/xz:SBҺPvJB,onKŪ *l䀐A;jWnh1+ԛI*YM޼zX셏H*eVIx J [C>&$O"4~?^7/d9'͏J!_9AI8lpns˜wL# NV pJ ֹA9^iiW9\!rAn)1H[ܷAP?M|/Yf{QVYacҪ>-v^GT#-[۹0΢$StD;Oڦ4:*G-A`$rHj٩%y؉CD8KIA^a 1Y,-,xGI4 G::8 L 0\yjx0ϽB\T//˗$L`J[ ƹYkm9"/?1F]sqeHgE~e>v.c sXGEpF'@h7($Ƚ4X#ia-x;Y5ݩ;[SO+طd ogJZs|{oFH % L͕W VA@̭p2| rBp#ZW˫<>~W rhX^Oa&NPsG8ə: |>a؆/vi ]4\K& dQڈQQZe #B1PBEj2(:o;cƌL@ZFL&Z iX Ί'R.)rn@sgZNG5s`̑aq4V$ !+I8 KB&iV@.(7 qfaHB9UaaZnlP;B I'EInC3Iw6O6:)l |:.ZZo _*-wR9oY?{qn}}^io"W z fؾcn1%^%w3d;h-REtS\-&x#324Tf"T;Y3.lf >*MdrJS=,OIk#G5*H ndM>Hڊ呢u)#4{n6ȋ/ >?^l~C<>>||8ە6Q+P>)(ŗHJ4:*(Yl\ST_Dȝ؞!w")C lg U|D*d86x5FAMJ(F ךECg4bo֧\bHZ hJPʱ1V:#nly"g'9G ?j,_ 9;emҺ[j5=>#~Hf2bAߢB\C `S]57I.kla[ \SnKԵBu/@.ˉ^]"{UF8u*ϭ>XmtXo,㺝2OaHI^\PZ噵CEXؒ6wo{TU6h 0Tc[p$*~r`wt-?g,MW!-f>Gb7Xm^lyQzۻ1ݣ'mzvC{X 0Xъ]eȈ1)dEdVry J; eԼ]~ 4кߕUa;ul~!G'6^|×F9n ̽΅?}G7G9qr*fisrߗ dÍ}>\KrO+cDf ?A'zN{0ݡ^/^/u3?n}Q?KprTSZÔfoZmpؑNTud!BӚʦ.][ mj(VU9jA)>`662X&BE*Vh(ZI!G jf7ranhҁE6TX/|b#lD_ b>$p\|F̱|8zn")퐐Ҁ&!Z M)1p{{}EOS%cDiv+ea}%{VO׍oC M|b8(oZyL )s%H (URB1$3uh䩰2 >c z1<لK,f@K]NY\ RE3*P'Z7(Ҫ2;7 >4U{1No< ] c ;3S`sZi}ո]"\0[t 6{7g* __-mmɏ~#_?VyGrʇ/# OGѾȕ|OֹR{Q0*)\˹HS RaaF9w_z*%9_C&Ns$tfxmڥ|ȗ6r`Q7z3yyhѴΖYel rquqv/"ל3c:!%<`@Ոm\rCvG@ vkqמw33j1!#RQT"XT #L썪!)G8 (R.(=k!xz~\o 8g= иS,5ʥzVQ@o Rq?#I#E#4^oR bx<Š53*X)E̵N %TJXJк վӤkfTQ9?-EmnÐ;d۬FnH<x 1AZ}%]8P Zs:$(F!֓xNIlT=yirTJkX왱 3\jډac7Th< 7{ߤRXs<TFdt(ehm ERmH%ʕkUC1LRM~֛bn?z#@Qo(O,l섗z(&E_o!' //=^EĠ4lUEOٱŐ| ࢢtТ‹IApR^Oj.ɁU5TFgrDJ.4K %ζ&8YoY|/q9':fXOcܲř'gɧز>ҊvlqNݞ^n_#}p%Ul˭; [I#lAn6/nnmvꑬ{D떴qK3yzKܴ;6Hj:+k7eNoV} kpx|n}7& {_fܽ3R]bӍk[Av! 
;xstW,\{Mx>92WWbx\on֑)-asr{֯w#ߔt<+yb?\8/$ G{~f!ݞIX9yדah][}lgϛkK/J{{&GC:_Vgˎl&kv»ڱ/l͉=_хӼ7N߀5#0Hv܅)Go_mxU'c4ըLٶil=G(k Flr.[<://7ٿsm/#Zb-Z\W^hsQ{ &0yNJk(Oe7ة㥨ޮ//۩nG\x;{:rN_ըN5OT-yDsiNJEh혓Ȱ[Pc`&p:iGHw1z NʙJT\:"97J]\L]1}AD<bxb&^/ 7LŧABIЂ,&@ػz/rM82AQ7ү2xpoLr7jFۦ&Z J0!hVVH/)2$C0In7=n, a،\8{ǥvZs~#* !0}QP&ʊL168J%믑3ʼn/[[\(*$hi6B'{vhu<ҢQtX^yެ87:~./~YX_՝_Nnn6(nwƓaoT2ch ;]V OiCl/go G?NI / Lb7]Np/w7߀-;Np\ss;ֶ֞c5c1BBXY=J;EC9,ᘒ!bS X0RRW *2(Т S띱Vc{-A]Dk45=ְ"9 $=T4JŢ,j(CHphО.aYo1AΧ$EN|4!˓LRvz4{?)B'IQ!n}4tSvx:ԏp*2! QNCMP Js& $rT,rChmFFT,2iV 7[/&Fkj4'U Aq^S>vN0 5EAx/@_'ie*`EF p$3@Pf@ 1 %2N=K4Zs)6Y΃,RQ-RLiꍇW~g=eWӳջәA棚/Ir /'UH5u%5eY+zn/{y N͉|`)l^ Lw'gs~3lh5vοMd=|ؗ @{` <EJw irӅq Üu{뛗0EN93XZP-ӭm_ -_ m_ 5_HԊ%ql-d0QL/D)39o#b)5ٌsL(QmtF#àGw9lVJHH!M]k}%nr]Xmtixrn?{G,-y"ä"CF.@ Tj ݸ^)wzݯ tܒ0HE+PDZF Q- j!w1اiJH;O$1z*s4 H#ӁfBԧ1"\)V2""&ZH0<mKn!6Fjy5. _ δ Qt~oڐh[0d|Q>RMojabcBzfD:2rPQ17Z Zf+aNVYE 6tK4mhQ9ұp'Mj:Ա1r q`,pzooWk4hX(V|"R$^eݞwFQ17CSN1ܘĜ9`rAm4ŦU![U!wU1^:ᨸ$Bi| )hAI5kCGJ9a@wsʚ%{gqoAU9A B@fɗs|9\nd}ְt-ac}ia`uSF΃> ^UE(:TMth"}Mꝏ*"mt3X O*r%b®tU]=Gv5sp$H8ͮ+3RΫt3,dC@!~1ٟdz"Fs}=0!Z}{~Û LƝnQT,ldzNgzaк||:=[k%W2A(6`zKsFsEb/8k6|T),yJs,ݸQ~I]>Ug0 Ѓ7z 5>KTbS+Ef6$cg0OՑ\drCzo^<10 f\rH8+h,71g/I^r쬅Y $g\Kr1Ks6waxD ֹ|Jxgɦ:{`w\}A}ˏo~~ī^ӞUzpE<g4NX2(A Ǚ*j* bepXGHapUzx!]sc{s? wY>=5_%Üؿ8xm.o<0m䂃(V,UH?و<(qI4sȲ:(+9 ֤(y-`5Q`NTc,6wN|8ZE}8j֗";E3?c<-cJI S䴯׿b,`6w?/땶K#D oCې6 9oCې6 9oCېMBCWfxhiSƿ1/V 6[GtT[ECsFzkj:t8P|2&E,(Jz9@Qu+¥&L@ ViǬQ(A@ kF52#gܸSn3`S$Ivj9Mr)0eв|~-n[L˵7 +WfkU)0 6@!( a-" g,[uϞQ ϭud"u˙bEwD@4_2'T4k$vD0<KvbA%NW3y;kYÃ? a䴔E촷K*`2T%,G9]OEڷ%VH6Q\r :*6F17L$y׽z4ks9&Jݭ֝\겾؋%?#5a?,|)&#]1K#|~iʗK^*`MV+og3_|7׷g޼?DW u#02JݍHf '?4GM5 MSVi6jWiW4%>ڭr@f?Mם嫓St2OH 3_Ȇm~9MWT&U d*w ?浔QY3ռ>yN:xD<{= YA'K6hB6Z{+D1R(FD#RVOڶfyy{a  xcB$^I#Vc,t jg3)u! 
9%hiiT N3:0 6\EʢP[Cck⚭li?!k"ٚPkFXպ<Z=3 `L[(nPiV|iRa1Δ3 kp 9Zx5ծXќ2YmCEF7"q)c ][S[r+'k3=Su~BJM8oۻ*==B`Y$ZZ}TR{(y)[@)4ݒrIb+خ¾ͧYWOy$8A}c[/ީvIǼP:G1me٬[gXIfgjĆN&,6)a;Yz-R/M ysT>?9qFs,g }q31GkEGvYCR%J$rjT6Cac"D9$RZ2sbA\MYת͍sr9aQzzdR2U^nf&p,vg1Z'}HBmf5A떊gUa1_h_W_x-Ql:2ޒN;>^*ǫ_|||珵K1EQ OU G6NRm^ 6)*!&@x׈K:]t kd/W0YInw·=T^؊4SB챇ӓzbkӎ]6 ڰx>^ii `T0DaGjhw(pYYTТ$3|#SG魱&smTMC΃0qkRul<(#-qi|1LvTMEFJb}.Y>"pΉ*7BqG W.ŋjeҀ` FyB5{[WƸ%Eq_|H]P$ >ޯ995"|LOHF|QIʕحl_>\0`H=뮗-.:F>i_2u~T~x㖽N)`zLH%ez'_V !=ҤQOXOLLNjQa)O!yBZ;Bȝ=B$BHIR !)G )PR;!RPبPϨOqTZCXRCQ6Jn iC*d 9:_L̹4Y牅tv~댪۽Qdy1eU}k>ă09\=+TZJhQ=IA!}QY4]$ fe=g~NcLM :[ mβ&O- w}ꍳhչwuod:[LҐ&26lua{bn+Y zyfGUPpuYlooJwHۓW5Pi/ڿ(PT9GH-pVC/\=s'ѥMzu,V#>5Ef>$ w%`be]n,_9=ߓruL=Ը'=h B&i=9΂!#XQ0ǖ,v$x3:hF3:hYeG.;Imot.s.(ba+4"b84^LK55}I8>$dKU YMDOP#e`'ڇM|XZT= ZTdjr%SȰ7BР1˲qKп Rrq֋Fk;Pi}(@EΕ`S5 s.3d#v7߃P-d|)D.VmD B?OM2ʴM0+>ڐ9рJ?SʝjkߺEK y3C Bq9oD-42Lqع\C\ ix"{FAK:ҁؓO:p;n0( hI4`vJ!) & Ӏ8 uI$˛H`XYWbFOU#UB#!IIp`ImKm\{9}l+@ʩ󹆪 91ɻV/Dc V*q j*9]D-)~%)'cÔcw<-=L5&޺@VҌzM=?ְw3H͈dPL EÎ,Ʋcy.9IR0h}pn}z.;vSSI Z|^7)|Ɂ$dn;7ΰn:z =$^(1 ]`fg{lR.e07SH{^G[Faڀr" NgmBlaLLR JI`_r}q$'u=LsKܠ﷿ O,bIblMupJ՛X]-XZ5З pFn} r-Ou?XO;>>o__kґmɇ'[P+@g?~ǃ'3^k!/o#o[ Z҅ hK@*wr1} FTg8R3s'O)61DaQW_zaϵe=Vt] t z!>sR]ڬm&O$Mӻ կXl&&NS;uR{zkGz X+$JQR3Pǩb,as5b2֓K}!\ ׸_fm;=&Dח˿gZ[ק[6_Mh@zG7Փ*cb3>4 eJ̈́C)PҖNu,8n?S[̻h~C쌆8YQ$hekHIZ,}=:LQBLMpzYprq)K5}@Sh!7l cikW ZY-=wKIK~^SFyj}< |!]z5-jg[&g̕ZTpy|3#S{Ny'yM]L+TA<7oMŦؓ K$"'/ccVJW4QB#$gQf (g@S4aJ-al G)GT&{S$d1ʋ ̈leKFb|TdW㳃j:h5GsCU Bn6kPBS'(m<ƙ5׹X0 :`u[L~ZZ!(FiioǯwH䥊Ml1rkc)W3& Zui>&Kyc2&팉ff\m:D21'MdhhR'HIҙsj~z³?GQ7AkͺÆlAz*-ggKl@LtoE^XǙ]t|![:YXs76uB)%^GTq!s/Db$htӱ^>g*_ux\FJT):UcKm-%^Ҙ:TLiL4`1D|1}q)޴ڧ7dq.is" A4Xm_)Eů*x͍srY%`0q8t] l)6ɔ |ӃbW٧J}5`2\ق)58jn9)O<)Pu |_QHy{99⣓~Z֢HNEɈD_urtx^Sϩnr;xNTG?;UB+%*r2A#Qp52.~<ϒϱM:] 䏃R|<=9}awSڱuh=ywЛ}@O驠w>_\憪b]̿|G]YC+1X q|KCY3y2="E?J=g8MwmJt_0; ,^`~Ii %${[$j%d)flj.vUd.+0j.;۶K}y2ίsPO1ud7YKM(j$n;*vJozqRXWoɖ r:0GvUrj)nYLJkvNR)0~\L[{f h]?.ֻ,]_ie-zv񾐼n_x1AFa>\ݽIs1ڭW_#j dQM]mmι Ao3Np:j6aJ۬)߇͖Ds '7/aN(E 
f-@[k6676#ER+BD:Upd0QL_Rf0rFMgԠ~//s/tIGvigNjYhzG }{hhq$C7 }θ[m=NOnEo TDBK$er7!3: GRx((DB02FO`F#'ʈ$1:h`_ˆ>Aj(LjdHԔ1тFQ4`d[ʘtbopan+f?Hu|P+eg5 y۞L sl,a3ԻﳾBG95ԂbcBzfD:9t ) D3앰ZӠUh4BkMR!4:ڀmTt,%! jC}uM58\Ķ#kWk_y|עZEo+rfF%B!~漇<2Mb [[X}HYCl!#Q{CUGGfk'5OSfAy3@{&_!i (5%RaLHK8P1StIA7>.nc<-%zuj?i;}ys\V}kkr!bFCp  D\ $*#MAZ5BQ >{x4/ <;0v.` hZ| [ HHҪBi.%K@%T8KcOL.Y,-,xGI Cot8n` c+}X@$n;WOo;"vR p}pc>0gӓΗ>Vד=ϏyseCL@X+ØSαA2D܎C$F pݸq7Nms7o8"Fx&H@hV q'&r"u0eD" .b {z6c}Zl!C:((; 6f+SԇO\7P׫<8vڄcхQ2J3mUT$M5!pvㇻcpiCn"1: jF:(+9 $BO[sqYV»_ᦽ0bdqQ2KNw!}'}b1V۟σ8 nF{MYtgnnk<@v1MAz2:{_bg:{w _ "?7N|tCMeD2ÕicPD/w|4e&;Ki21ގO +AC̃ڨR@qa1%ÚX dt}T9-Kٳ9gJjTc X_֡b%A>äXT#ʶq"xL/jջP}{/':] E=m3{ 'FBuT6{-a̕0ho-1E5Lj$ȴj:6q~W7p},ZEc'nqt3=t9i+#RBE{\FNL[Me]t9nivǾz,&URXqWI]ů]ķ*ޡ"B#yM(9wԕZhKN$zw]GwE%]W@]&JڞzA*ievW]1)W@aMո.*i]GwF=GF F-aJ  iP4T(0 ( RjDs$tUR aEFӞ7u:YiW3kpw Tx{[bcx(W`r߶K= @RP> fY pw5ѢZ;7J!]V$FQEĿՏ#u^׏i!#D6u?JqOԕ4On u²A=͘^UwJISsm>so>ҳm>FV(t`DvF1N|!Jy LA"^P1G vnӦ CcZ)"޵4#J9xz7b2sp @ZlQj?~$)Mb*HdUe&|G - (ۮ5#}-ԧi/iϻP#s$mls3Ut:}6"يRN]tUXr Àwhc?ʎEd).˜8-3fx+CkwS5S>%P|Qzɨʙ2&iȓt"ImR$lzDNJc(،]&Ow~vsnV^-~T3 Oۈ OguP·6 DrJĴ'c`MY1<$/n]wߊ5/UDz7jCt,Z)m "ر !+-]BATH(c y+WxvR=ΰH @!NI6󌯽C)%"lT @mw)CUyV}]_RGTT$A)j) M4J9,"S.؁!?ִ yd Jq/Cd(Zi \Α QΙR0R[AڧVA%QK,f2I?wBѱZBԞ`6EKs_`jYU\{mϺ̂~4x{~Q>WZՇik\ufm22*w9_G~2M_x<막]i~&> :y'NH0xdrÊ$_n~l"U6&!r Yx arww3}rN畹+'t/?BiFa:}?ޅ2%YZw[̐ӛ p/.Ǜ50^u~{B;(5֮g~hvzuS7?gG~W1ǣ^s>ٛ}w_W$f_n"z{B<gJ a Ql1g\ aFi&rF|2WL}4K2MUchTfkfk 1H$uxzv-ͧz%*G$8kt?ZA' a Py=i!YXU*hvI#H LhJ(7iUsf<4'߻蚵Rwc[{(q}\,cufpEPR(Wٛ4>S]P+|h(AG#s! HΑcS$QvLw\DI8U^Wkk EDQ*A@횑8Lbx}G.|и`GWU} Ora '_>uDlZ^?ngαdUayM:L. ڃZ\,2lZ!I6F Mw޾:6L[=֠ v Ůu}͑b W/}J%2|Rk]߇T*+u,C1GvśݟIآ,ّ2$}!NZH5v{"0}w}\^(W ႓)).&3B P(}}7>dhO9c#^ Rt"7)b5B09! 
aԨ %+fې{;,S66J9(ud!AJ-9/>89;0>b+:_o݇0|3>+ J= -ֽ Ϫ]Sv)*)e(hy.Mj"(:m5#u%pLu]*V=j;!LGBN0YCa%Z36#q4Ӆ8/PpmmE4;vݟ||7vB..>\?}['6t ٠tQ@Xl6%=Qkhkt6Tc/$% F6rdu: v¤BB׺q+rjVL̾hfܱV`reP4%HI{31[ժuE(9RX l!32hkj [Ȑeki1wBc}،Ԩ_MT؊?ՈX# I"1!"k< k8Yَےd$k`g?)_5̻ܡ"٨?N$BZ3tNhh 'bՐyDH)e!cJ܋*k'e)_F Vנ@9#FFMB" OoqP iٚu1PzɐMrpK’p!#fyuﮯ>~Q֣޽=,Oy>ɲҺ{Ճy.POHu^-hB dW-*V%idPۤfw;RY9ٻ2ڧQgK`]Ynv`v^qFJ<l4kʼvoɫ&Ѣ [·T3HiѺwp%{/Qdv W7#< #F<ՂG?{6_}oKTϼ@wA/.oְk{gP_\\7P,!ݒV=ee]f; #!-CZ,@Ya}/C˦Geȿ|g坬<ɐuYu 3\t@+Mt*LB.oŧ *Z!8}^RF6 Qə]١W09}ZQq}?8J9[KWiX~%{WGW< SH G6*`"4d!Xdd0țVbn*+FXJe7KD4DI&۔0M,K"ӌ-/! )@ mf(%D0Ȋq޸*#> xBTCy,`_P~cY[ zKY.d%)1GŻ:XW-pVϵ}|7 P[M>i>:IrFLmT a[P2`1BQdG2vuMh8w܁ zF^3ƹg3ɵe5SR}uRa yXsd&>tQBju\XTCG/`.Y!8RԢYwOb/Z^DԂPDI¦XLE1̥UgTAA2:*:RjĶ.i VCh ؐ0xW[h}o|qcٿҘM;%y)`fly vD,R;-qluEu</I;Ŧ1TYM"Q*.(\7QJ"-&s>cI}EG@O+JMR^ŶZ[PDΤO5#1X`9)QSϚ==?|DgBF%E&mSldR?ٓd R'ZhU%;ȿw dYR=/C 72[ݻ'q6J__D*$Udft,fiFRft}iFGX fG:ܮ$;Chum=zRAy,.z_~w߬./6&QݿǮ `kw߷`5F6WW񼤋nr]5H ?]ڶ*c~]]';ť_e?p(­|MaTϧ᧽׾} 8yNxϏ?ډ H3 SQ49a}ȴس7 7<#zRfDS7!g<'|&1o Y/Kw9YӞ)QJ{G_xjqpZzǡt|\S S OzR,J/:`Uki)tzutQ ztKWttu{tQt $3dc;\Kf)]u* z=t䄖:`oCW厝:J=k+%u.g7V1wi6, +ן.G]ua1sW@kOw]Bګ/Og+&/VϧRAWѕR_m.pc;/ (IuW1d;`CW,a~hGI૤+Gr@WWʽD j'X;Yjx2.7^ZU>KǠ=^l>v=q_]ubCWnK+$NW^!]iJ +v.:\UGRcumhAtU+~)t楖>0*oԂ1+K(VrW+TKXb p -fG?^]}ITubo6rңf7_Or gxPN&9$K~}/x=~op~Ի(ԧw?.jsqyxƒw&8>8ӣKf֩8=L&w1n>L\]?bt0"_X5\xLУ)uCܫݬ/q{LV}‰'ayDm** [w7 213p@OΉyyEng3qѧb{l!G`??Tj,Mr4F[iˉ\ JqmN"L0$S_=$A|}} e-?~.˷_ָ t߮rtEJjҒn*h.^䒥ab b>']_JAUB&Q(6m1s1[Bܼ*.^}G2dMZRo짅ynRצ BhogM%,גqϑ z-!XDvsZZ$PkA/T2:#s%(72Pl)M'&Ѣiń`/޾}I5KYKf\tɚ%e`+qJ0&ZK}@,8nlZ;B4Cc6]@ۘuk%A1q欪S]3 ф`[GY-ShVR!{w ۼ}#yeXD. 0r@N$劎ƨ&ķ٬ZIm=9q"gb<#3k^&g#BmZ51h#ZlƗIah'E2 VAfmYR4RFvTX R]J0I/ E3U0S`Ų ,ܢ %tv+**T(:8h~ՠ휩ӊCЗ VD9rB!VK${1hd(7p,Ք .)[:+tŅejPh}]F+K<<>o:@C>+N.GWDB˥{ wS`y!9? 
z/,,tB\8ǰ) >@H&902eް$͵"`z `٣>Sa^$ ͗tdԭɽx3$nCfp6Ww*Yȩ ՏU?Ϣk@bYUY{¶MfF(NAO>}p8;wN}}6B@FF|uI>`"Q)}8#tKwBL)ڽ,Ck5Kq!Q y@DEH!6vrm쬚F$K&g|Kly N$'e#Krq؍l:LL*>Q\!rP+ bwqXբpZfa" 1!,j )flk'r(sէV|M;R!eќk<\#M8fs_S\ 7[Q\7m,e`Yn$, \{%l64 ֣sy^t:]l}L{uzq^o8ugm?0hf&:hL#ll#5H(rHuܵh֘u%n9œGBjCK7#;L oUfu($}  9-1l?#F&ڛY%\bLv L сsH'$Pwp}fnmgc-X@|M7`E^H"T!F \5$\uF1":+># UA ԟPhmQi8UQ݌; ;4Yh\SMF% b<4jhc1#7iMK ڟTO5d(9'9'еIt:CAI+MK}5oX j+ɠU:i`R; ZfS`-ljaQBj4AS'MG~YFi'aٗR=t!w oA. `n֘%lUvTBX2cR4k&y/#I ̝p%G@B ;okw?]1;.<!'N)sT~ԫ/]ܮMRr\_^KZd|kG/ݵ>ǃW,C\ :mWן_ zp Zq//_yw^|_^wȻoޮ]w R\;ao^v MYƪrۭc\]d\Rd_ Y_8kM?]\EΛþR{onBq?v=}8YjOiywutM=.oH~^h'߬ 'ڮs7'/jTs+M{bC v;SW'NK-i i\NUSp}%YЫ,Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(Wj\Qp5 F(W,?:\yOj1w:{G*a\Qp5 F(Wj\Qp] oG+q2T;l lA`=FF"rl/߯j%(QLXJ 3W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" Hp+ÀYWdgWȮ+D PpeW$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\=J2$13+Z/JW2+IpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" *R  V=u;zԍlj/o@S)y7=6PwIfd;0Яڣ> *z}a%+tho;]!JMrC++, ]\%:SD"іxp[nF@3ڼ]É)WSlv{7qsL-LO/M'ţjs841?zX_{]|4s34 p]i\*x:Ѵ^4|8jx5ޛq Mpw ǹgmp7£>XXMCgQ9Z~S_]uV9Ҿ:N`_ zy.KǾ>v;x >-LWu^ZȟԌnԯ/BLU+.)mqa.a$p<4;ZWII+H\ًt/u2=Lm6S܍e}*)eT!4꜄R>f*|h3==@.iOU Y=md4 e"JWQY0H"tIeJ'T`*F T5{va)t[gyF"*%YժJΉ-6>'wػUnvCT3]!!6bߵ鹵Vw+thm+@7fOJTJ8 ] ʮЕpIvK+<V]틮条+@kk;]!JK!ҕrZ!t]+D`QzGtut|nݙjGZt}+DȻ:D2rm:DWX׫Еm7BVcֳv;SugȀ(ie Suj#֚w̾+D} P|wLp|cG_+P~޲ܶNtO?g`e: OK;}m4JٲsM4ms'DW؋}jrqvB] ] \v0 +thk;]!J$RvR]+Dkx Q:Ktut-HB3]"]i}wavl}0($:@2_!q >IUt( M"]Y)]Ywpeg62 ZzBI|k ;ٝ WwfѺӕw +]+CWxWV~eQjCtuteO۷_8p9mR`vμeWY߲KU-FzzP'S+kjNaFr^j6fT\*jήҁp}4wO @ |'Z iٶԂ6=W?˾3tp ]!ZNWRJFtutCt-++:]!ڧŷJ툮6~^ KٮլJ[o'Fw'uRm+DiȻ:D@ҵ`<+tpMgADо PJ&qZ6;tpEg+Du QZMtutedVu03tp ]!ZzB] ]9]AGiWʮUjGC+o澀IxH6`/h١z%[Ki+:CӈV4Ԋhisn6pmgxJ&taa\m>'ͧQu6 RY5FTlyy:Yeٿ޼ym9 ߃07Qv:*d*%'o |o"勔 ,Q9)Tڬo:ß'1דW/V5[^!0~.y#Mn^pcU:)|mSk $`hn?՚_e}o>~|7SfJ,w^5iM{[{:4Vё(0.i^" K:佒B gy Q m&(xܯqOv>syPS#¬T|8 ,/$mB(30tI tJ-P2[/05%JppI̅P+`6Jp.]Æy4l=!U0|t̃9lsa@m tEWcw..,|~NX00M ; ?:f={?r|EV5ٰ9n$NMZy 622[{nt-l ډU4)geK2(T++$A@Fx RAĞ;| j :3 
y>Jo/f+í [?\jo&ӠWX}BYjL7x~8sr.7X!ۄL2)B53a e!3xuT0l֭NؙJM%gN;p]F<75$S!I- jPvV,W<'EeH\&WT|Y虀{ˁ]|0 ]_nF.oF}xl}9xya7!ϑ6{i+v?&[#ׯ}%Q-qȯ=-> 3'I}?*ɰ'j[+SeK`IRUN'T`=B)\jkY&- g!bT%"LBR2#&DS0bUb bGܭeE=|Jj( FK)}S՚SPt%"4C[bRLی?_gm6~< Q೜UewÎZSfBӯ>% }8՛sZg/6^5HTl[|V IɶUI&d4eöX ЮX7>RE]:0mc@B颏 ?n\o-{7WHvNsS˨\q 2.Ԭ^-6KFc?˶Ÿ})_&;֍Ҷ}^)JSהefMyY,Qߧw_|o׷s?]<퍷 Ȓ%b/_]#g^?{O8_1K`},ȇM 2A>m0HSZvuOb{He˷djKJ%lY>&778qT7 %-`@}:G_"!^U6_E{aʥ;;FՋw̪5Ug _ 4+:ynWwη~^nI_ucX=9Rd2ERh~iKݣ`Bt飒@`Ds${+>:LD%dK>ጠ+,zӝ׋KWIӠUFt؊I4+Me;k؊T@C%")3 )4L"c,c4K"B$܂Ȟ_0iFL Xe2*r:ZK~9wwx"YӬ; C-lvS9PKAǠa l fRd2`8eGqRmS2 OD23'N5 u!?B@m eb,3xLIVhʝʽ0vi_ -c7W/sSf!k0 6;|߷/? )qYMȅɛ`>ir`!KtƯ`1N͸|>P&kX3H; >s(%n4%:k4neOiPDkk$䝋hLqAi7w-uD't&ڭ8ʋ݂ڐw.A2%sgM1J:Fc>HP,[#ZW!!\DCd VF5yݤ`ȻҠ脮DvqHQ'⯐$7~Ͽzz~`fHJ9RR(+GJݠN)uRkJJqvRS*HKyHy~p|*_vKd}.8Nj1ǫ~l<ƕE!j``S4Uݫ9!DZi}\35ykqIԧWZ wj);% k)q8xp<8&hv10$SFixwdS҄9&&@TN Cἦ=8<[fvz~v{v }\ߓß)#tpŌRhB2M0gTa-{'bšc1CT*f$Lm&-4(KA!p `@R"B"d\']P`s'H~x+>$ Z~4#$QMT-"%L֓^R+S̈̊ٺ!PL1v. r9͘Dy[pPN-[^ 9dځ$+SbEOk-p=k:_T^~(/dgy~?ZtƐI.+U1<>OڪXqA@!E 3HZR:Is;Țʚ抔~m\rMC&2ĢK׎&ŃI~{A3@L̬81jֳͅ.G53ՋvkٜfBWnRC1xB35v^0Һ掐f'\Vzu!D},ηgK uɃ| uAMd/5dPi}q!I_fnk K!>5O5rԈBo]DyS,r@rzhÇ4I:06z<*+ͨG|z!s`d1O A.CvKJZ,S:p!ƽ %{{NKNˏ %,5y5q҄G nKǏX}"D3 nE@1۠.ZŤZ*JTTr CT-;Cf^B*?R #%k EhN!P*2d-VgSS!::8b2|^m&6s6Sh3l3%V_(o?w;{=ܽilĂ ~SNq5iʠP=/cUgðp~x!S( ]@Jv[!?B5NNd@ҡ9&boGQʁ)iJVVk*:C9jLn AõYԂdQ2@P ;qT(ܘ>} xˉۛ(MwfwMR ahwZD"*#4 -Ry  3-,8Ʃ "wt6B&Li: x%RMgGN|EOFbtm?_фC:+zW4blWth"\ A@[K6'k1y d)}2AwoeM)VTUd:_?}m;}l|{+=#D;dDtu5bR]Xxtzp1EkWe j kEE0sDp!Zftq $QrE  Yzv؈|G@va<~]@ }㒠Si@@EsڍrHMJΌY$T.M)ILpݾ9L҅b*҈2ǰ $x҇{Yb1ĐNc3 "8vGBA` DX HʴvTSQP|L%N=C3td"q*@2S씨u͠!aS"\<:[rӭ )8ajZVpop~bmi&)I?I (J1!yU.n:Ijrd6֨V/2Gs))!j|uMSޒB*ڧv*a ۷O*7IܷLv0G?L6ϳmsxYbFpsƭ&ɢ2!e<&@3꜅45)\L |'6BG .gR@-OĿ7ž r-NZ:Em=07l8/~Aٽ_ETIUj$L(snfOb aYE՚쭂:}aHD@تD\x~ΞI6A>,{zTJ}%pJN$!>Zj3OUDϗ_Ca©R{$~ٷS$EM$E1\+\iI,ك%TP >@zl{zgܾۥ<Ȯk+xe-sٔP?rS&O{yaL/^ < -AO=*8hT)㚮W]/g~<]1_}6 Q ?+Doo; ,H0:4^2&@.u}]΋-NA(jeM`!!u=s6q!&Q6N86EKm$ONiB>㩜-z,_a`4Gٱ>&R(mƗ5o;;wVcUUQ8Qzՠ]bQć`  ۷K c#doFFk9i->F\ :Me!~; H%&3q6-`W/~k{6+Y$^ŁV҉{) 
BFNPuHoUڥ㈼qP}SA=!hy[-/TY,揣}>(ph9=;^ 籜pae籯<=r{v?`Vq7<=Y @R]flUb$OӬ{XU !XAHw 3rJ7:r&Z)?K]Ļ 3E ¶=}T=7 !d,q9{ʇTEA Ce\dJ<ktoYFtѻBR  b&b !G!\0?WG@C'u]o@(GN;lD8R_&^kƅ90VC1TtQ$M_U "v[QN){WAhB &n8tPt\*\9 5j?r5Rdg]A秋Oo5yt*]M4l^ ^}xf094rv sQ6}o'-YD ǭlׇ[BA1.w Tpay"A\.tkׄw!TV/(Gv-+>:㨱ٻlW=M0@?nĈ\6[6RۃsBIH,JU8EWϗbl nZ.+V [£ 8gS w?0I +,97bKf ՘˄2Ũ|Z@~4T*J*O.ETh|ZWsp#-ڥԄajp%㚴3ıPܐ31ipjbmwEr;)#/Odz$0#67}&gC{_jVQE?j`>a}ke'LxΙNZMmP \(ܪ Ԓ)H]vULi#H &!Jgrp_c2MQ]m; }J2od8ڒ|ш_pƭt=2f?fѷ TjK!A( } (9E_]Jk्#.AKhT<+-Vx t+fZd]Hȟ 7 ۨfSi51 QRvR'ֳy4?dªB"=e\1>[{s-(Ҩ ܴts-jfIpc۴RWޝqϜ@4ID䬥d׼LE/)G6fXE JJM"G~]TN4.&_l.wMBW& 1&[+;iS؎d1+֒ BpKE|9UTn8Ni#mnJ<#Q[ $9) 2=6{L)#qCE?ؓ޳6lDDos!u+쒵.VRIm"=T"cCgLV-=D(~ ŦUL [޾<#f@i!١D OF"G`b[pg˅=*d7YM.K-0wwwwy5WoAl?,~&A2S"?ys/-{xB+% oۻ q7Q|q0 Em "Dp?Ot;AϒG{`=n-v:NfƳ ƃo&!}>6% 9qC\_wF'\buݝx t7)uʛ$:3mXiCFbod2_ۉ'ȇ6H3ݛ8wjg9W'ǜor1l.$l|wWc1+wz*Lad\~!(FFhº?fkpVQd? q`s%w\=ˑDŽf!nqXgnJMͽ~Y7wEXWoɍMՑ!p*r IH܀>C|J!rCă~iLa {d "alf9!*"S*8)c4p1z1& [RRJg/\`T(=O@1zdrG(D/a*1\;ΞlF|6&3̰<'}wB8ί!l^"{g2wzD ňh@h{bZZ籱'ykq@?` Wcsgn-2q_b@_ LvXhPT [;Q6\񜑬i4Ÿ`1~lNy^z9bC Y!JO3Uer'DA266aW8K@=Hj]^`80h4{LJXoCn#7(g3jd8>`)H}xmk:DNU(I0FsUkbɈr _S]~jʼͼ"є+QA*WN 0zzWV۸RJòuyuX5Ǥ3ꝲ ƖGHJ)kY:Oi գ8XҮfѣKc@SR)z\@ `&2#W?4#=HM FӴgLj.dgj\?WF=Xo)v5ZP#ֺCM jW6Vtm 72-v5i4A5:f j\f'YOFQ3wZ*q̫3Y,I*lejW5Rq %5^m[`'WM;k0˄euUC K-h_.{\L2*[?S-y'ShyG5'3a{?krp~e+!cf͊e+ЫeX + kg$>z0 W}| + 3IufPG;vIϚ!v!<Ne73y}sf6$`җ}syá~@_'fNOC$'JX !aQ YRv6FiӱK̭4տK@,9㕺!\{5*NEW 0t|ڙDFY_ l͓ ުón/>!! 3_܀H.uBżZJ%~@iBO 5L P UPE$&8g}^KJ-{PoOfb(SZc% ,J=qtf is);~BĐSr&@/JY߶L~Jfx8y(TEqJr}%<\pME !* BqXq)5 ENtkdI· lhL-JA{ێAXb(&Siŕ9y;*PV5; .`cKDTm8t'Ds.tj4"@)-gyU[B$XtPm2KK "0pl_7ԴD'3Q<0m뤻K+kui)'Mƾ1i)b Ef| $z<' cl1EDKs$4Zj]*&8,"+2Cp_p_pԃ"E,fo`.b* @66MŎ)\E6B-3$/N>{,nSMQ0b&7J=ǩWjډs.$(FS3K ί? b>|'a{̬ ? 
bvBw^0N.bggSebW&f;UlX;y=1aIJ($* ЯٓN%Y4b|=x7f]{(ŮPO4CL4H,a9 #7 Fn6$(`bflܽmqzs iߓ%q|04EP,3W`:hkczN ON衾:J*R+C|_AO}]z IђEè J4-M~XPf2X}I-d285pv7lpMf`R5sq7#v{@R}&|恱{9ixumh~&ըaI:+7P)lhc$ "oJNk&{\)}\pi]R`-Ou^H0N/m} ; =Px`  pQxl RxɅBT{zF'06k8nxm?2℟@FعՁ=@;dWi5, WmxPUn$-,tգk-`VP1[&je/-uգjC45#+MW1p$F sUC6 u3k" *@184otVdӠ\U 8q,>pV&!B3ݣ%='Wp&ƨX lUfΖiňøkmHEv_o؞; ̓F_met EŎ[MҡHJ}&KX|]U]]E8Omj9&JIu&A4Y9#!Խ#O('4gߘ?Y8Mj`bjT&{ymCؙvgkL:9&PKؠ QGp-"(lҸԔN"4Dí &l&;*+L)f^9S$]R:bع0N@%Io7E@(ffJ!|I{͋QE׊V!$l?Ksb=s۹%)۹%jHZk9خ (~;-u)1,rA`"|"/.{Ƙ\MKyh, Cz)# g4ghSD{Ƙ烉Yc1Ռ2bX4ց4)#N;Csfr=C\%N<G_ԭ[,qx|T6O7Zpvb;zzg7#a`TvqPYdqXo&u7c?0t75zzy40eepD.y&+7,8RK/mM۲Ii]s.z@ݣeezFEH4} ee\Rj~q9i/Fğ5d?jit<5ZHGy4I2^f"pɽ̅QZB ^TK?ų ][ "XjW vbݟ_IU{oX"`=hjVc&D2H(˪h4F v;i4I4nhR'C`Q^'195㝣nث@tثp$JAdN\̓&#kFNU<8')f\B`|b^xgq}/9 ] wYyqA+3C嗇xcѫd2NM sއy?vjZcrG8IQnM( @f\kڙ!IZ'IFuN)B@za_˜ \HH 5bE 0*7JwcUېB)\]ΫzV!7 ፐVzH .h-DNhI3\`[%ES(P{1*d!#d,֎ Y`eJ/ߟE<_̭w\:{Jr)\맏~,VJF-{\-_E4ȳK|09#lv+rW5~>=EwKg?G|ΩZu];kt}~gwі90RpuW^pAv'(W1xP_ ɠ-ޮR~u@~Fi9>D\/?.Ѵ~(W(R9.>:~<=q/C~xz湒c:T|`3_VI>KjC̗u;7t¼6czE|:`E܍Uo֐fvXNor@z"j7v㩝ubNzu?bqSF72*TQ7GٲH,PI`y;\-yJ4T,PIG*i~b7/bLRgb[-@:UKv}%ep&vҥC0R4`h:OxQJ3Ѥ|lE?2W"jۧt(]D4mPEMW'*֟4tz80 ;Tn@%-Vٵ_*2;#dYD-_^cS3JՈMp;r%RjnFQi8M;T;qϧMbod޷{X y/i}J|a1&\~Ղ^>pq~U诳V'鏜c48ů|X__ZelC ƑʘfI;78kg^ix嬨ĻzoĻw7> %Vj43} .R"ϴTgoE]KU:EQ' E"v*m˜gVTO:x6aպ}͏g{u>T#tOO/ѐ?F‹_.wzS-dyDzбDNaZ6w;VU,nh' 4PowD>3rj]H*_mыrsI8#&f:2Kzh֨dQC(Iw#~zJQ1lfo`xę$@Q(:V³MnIEFh{z5VΓ#`9̈́'z'hr%&γnQo T||i#o@!Bhl'~ sm=層yY`n!MO QJ@0c'pY v]bY,4ztra8l<HpuAGCpDTrd\q 4Ly6Z!-^Eʃ8SHC3@5` +0V8J9wUe %ͯ y|tgjJVw_}0%C@?~DI&!ݎ/""C$+^iz?| tp8TNݻ Oq2_X?4ٗKD_'͓Tt6FKPiZۆK|J NpO`¦ J~S~%Gc<.[Zq>R!0ԧ.F厃Lm;r'3)E`gM("w<ч"#r.1mcrMS%N[]mN E% X۸ WCNJ#@j%` Jr ljp"!;`[TxS2͐: r:vJ9l@ԑ@\-aVfCDXu=f\a>>BXjAX%ۜ(ʐ`&'Oai.O]!-/%cX I} Ɣ <2>EʋwT`4lEwJЏxKPĶH^*HN`ƷH0SU$OS H Ab(.ynRz[oH)yi1!})%&#q51l⬗M6X>sV?D:d,TFT"JSQ K>N>OC^DzLfr)ٚAje&S#ATjCr9BappȎPp$]v٥0֝ Pڮh{BQJ/$ar,XVdf9H:-Ь;+ 4tI4df9;B-Z=L!/*)n]} 3MV][@KF+$u8mr-*ө(;5NłqpynGwwVO i"=0WYR4|zl<]=z?fsS/|K~:fn["? 
nP9 q>M+!_(fv?]YB|z1/eYJG Y^=E"Bk1~ȏ=.xY(ODBeK$eWcT7syu%߆|Xr ֬oFlo2&;wH3o3A#Θ5*ӆ2ҨS#6Rr`&GmoӂTd-8//[]/†y]d{ZGW*h' tλ»0 m2;Xxs{aiQF4W?Wi#MHYlk$97wk4W(mߝi9ܿ@.kl-=gt0aӸ[;9ٻ#(cEYʸvs';<9/#hiHN+˭v2G>O=t>.T b8JT#v %ͭcK 9 B*Xppto E% uq@w7ijzUmv ,Ҕ  63ed)a5c}E͕dIb בI= hp|B.)p;T7-Z%JM)yG׹^OyryjEnLZqo4N^UL`Ogfr>6&w@9ڗ(ZT%&d7gjB ld T6Dpo{׾|eR:`~tF]ƼxpVbp0dK#F!Ii?~M39\l~g롕gasԃϋ}a v5TcиP9p_w}L[XF@;)գɧ"(g[bC>S˳k'XK=*{R"._hdSzGVj:ggO&05'wy[3ԥFCӑ殇FoLhޓYR{]$&-n|~q|bdMƈqsdƘq7ciiu{,ly[^y j8鎓ld k)m ٦)mKVip81+\pTV[9c-m)5&DKlzs9cSDR` VUsaHBJa3y3LPMhüytm+շQ KBk$ bHnX?)#.y'1sk2Gu QhKCQx`Kb)E+8ju"Dc$t-&s" =R\Ę'^F-,v-U!Z?ˬ lђ>EnFl9Z~z<];/ ۽\fW7u q)˙\"BW]nԲoJ9R}S^HjXi]7ſt7ӝsr;t3Q"\uñOuξoWg5CM4U Qhҩ&ŠΡ`t$VDD1& ׺mk$Af1W̴#IB/3>'w;6|켬!UjՔV>0}"I*R<ʪXmݢy|_ddDdVeC##3tncyo*}5"[rHON/kF-x[v=^JxaQ(6/~u(,R>E?eӛy$LF6籿l&)eȌIC$i꥗~9x6">3BxPa>dU-F Ʊ>6MKUk1`7͞*LOS:M.RM@ Gsڐ=μzz(RpNv儜l(p(ȼ$+%ċ8v=SRXfNϕzJlZ~&cWø T0rb'֊cZj3"+14ٛ[" =gn_0?̢/~w05֠qS*QJ!H. ecլZznh̨ "Ι뤏wIZg dUiQsm "m e6 DsA_wh0gE.'#ZPšin~~\Ǹu\Ǹu9.h4/pay0EaRQ,ăה> 2Hq>?n1VՈ[U,-2֪E?u馠vOxQgM?;DD2s:)c:)AHxc`+(0 aVvW5"*5%J3)9Jy @leN(IS;?Z5EL|;+U VRUB)bܪVd$&!TEk[! @hGFT7Y%5B;bJ Q7 N* (*ul}FM+^j<B 2È#^pZhcY 1$Z^ nd&!-H7.?*fƯ/b0* PJJFQf;DnX (*% 3ڔqcbd]*`} NX;\2;cKSo?VH4\ gbv_s+h{vǭ>',o`ۡ,@`yūؾ[^rΪ EtXvRsb夊& D0ά#z $te%i9+E)SFO߆4:07JpRcmP"5'x2]ICFz 5.f]Ya&[!|(Cp%T ^SkE@[ά_(H_lx t4f%>5k0qLz/"Ƒ(RF1V'Wonn`t۪;o0vX+a 밡{ !c(A !Xo1c|`, i):= *DMS su5?s;;'` IL0Lr"]p*piPZ!6c$ӓW!Z!DA1V"xcIxS0h lDVP lZ ZP$);KY[$P yD_b&x'p^Ej\*rZ"%t2Yɖw+8ShyfZ4yőfՀLrrT!񇏯Kyj6Xf=/=x\g['ff&QeT8 C^&axĂ :Jވ*ǨOWI @J"[b)lNPR:-FZ#wb/NbGʶvlPW8*%;m籁-Z'gj˒w TX",0dLŁt5J?2@օ*ݏx !fηh[rcn,نTҮL)9[Ӻm1Ko:0&S6djI`$=# YĻYRKr0~-HM@E '1it_VOw0͸VQP3,l, 24 !rϩd_UئW%oNP 1[8]`lrlhhGjӧnR`ܚɧ|,>e6n'kۘd42iIiëq6KgS/լaw0slaR7Bk&c)%ji3plA}̟lQ)"Y6鄇߿HZۧ̉ 2|u[\km2'u2u}W0/w٘onj&SfS԰'tLï\Jݞ2|8˿hY!HFxm}Lf+Л:8ekg"jyB5xN~LgSfG!AL:)}(Ic`7r U.pl|DEǁ`ـ=R+E&4?9{'V"/u?ȭ 1,EJoHufB W p'ȂJoxwbByIn*MSJhr:; ηyKaf"+ul-]wº;$*9AjdD'A" yPOY9 QݰF@=TU[2拈yWFO'! 
>uJnzjeFub=.A ūx'V,qʶī[#sG_jA{xwr`'p|,=aAHO<|2LZ ~Zw0쭇K,T|+Jr9x>|x3 |3cn4b+qgls;TU+[Z}qVo9Ef^n援ټv+[jv4޿C-{K.Ҷ9^d.~&8}a<2Kl_!ro=$Z'#or )g%/ò~sF1+A"#}3)j*(t~_: C[^Ax2fT ?qN(@,Y!$" gdo15+0\Jp}WFgCo- JWg 'k>ye2]u:mf[ΪB! n"5 Uxj q8 $_8JiY3%w5ݟ3Vju.-߿z*r@_v *^_ǫfdhT1AC5245>}>7Y(+< eAz eAC3%{rBA; ߿DhI*ٜĵ눙T42RRۻz4M"a2_ @ 7t<\WaUoY+ {U*9AӇ[`^ Nm FvsRg*LKOT^m#'K)~3`񛿼Ğ>)qVl ~3d(5J[O|· T#jVB(Fӯ( Eۂ hm!T3wsY4xʔ34PGD2Y8uaE!<'A0a0vNo `,FY QopAA ℂB(-C Z\ 넢xIҖJ%.JgP@a ٺev>p{ $ sQG 1Byj 5S@LV Қ:E 3 @0$iW^0*+ق8&NThH`) RƑ!(7(Pb a)X`(@ M ﱀU1d h59 a{&H@QCI lH#p)AF. 4Ŷ}n#,x>w ;3?>0n|c`lp;.fQ:^9ƣc 1ywOxg90fjdC_)ҽ/JÏ/DKïilߥWroNtIc0qED!|{ Q&l.^ @@z쵏XΏ(0ft;7tFcL?01LÇƪ{ `3J ČFBسL[Uhb ĂTyN{Ic4?vwkRz[il{$mvw8TsoWϳwܓis8G%kq9!n3.0R1 +SɬRI*=v GmvA賵5;_Xsvv.]0<3XVmVbP3\b]><3X#.ֈsu9w' ) :3buvCź"Ù-)cgkS.n]svmŬ"Dbƒ EaT|<$h0qCԅ0,bRKďgk3).flQh֪/خ]Xv19gks&.\!js7{O㶑_!eS0Xdܝl$68a:hnբ,+JRui.]GjC7t6ZJgknlj9)3Τ`d]qns^B/{oAJ!^8/,U#ꀾ .Dj-%-bhoR,184 ˀ﭂[i 춺rĐp\(`Lk[-J}-x]JxVQEKrcq.B^gX0a~v8)?xSчhՇbjVgI ;EA' o^$ށg|p|qeԇԛgL<3eZyBD< V]$Y~Xu{VG_v,]mö@R@EC Pt;4kGݡNO]tI1[1~)Ɠmk#VdxV\׋_~XX1[ICt:!j8N=o6kr&2vCXy.VPG v<jq呖elؿ; Pk*O8I-YEF 3&φO&~f-̅S;,~e{[.VSccip٩ԥytbטCzn ?)?y2=O7$QRV#,CN0!`rsF +'skFBcM*[, NXˁj-_Ӻ5!!O\DcdJ~9 U>K:6py5Y֭ y"%SD}vӺ FwA1r|O%ߟݥWwk@BFɔfG_{شnuAщ]cvzR`{n}5[E4J?߆uH"Hr#*2=\6Q#L;F H({6[, NXwfԴnMHҘ}Кu#a"ҠĮu;\XisT Yr5[E4J*ϏpӺU'm2IZI5ʻ_41}nޭ y"#SA޴nRK:6p}Sެ[֭ y"%S8\_;A)}Mjƻ>p=5%n{{~Y_mT$6Vmoe<8{}x$ 6k暵=[mV >-"i{ة"62 Mjh:9S5ŸhJũ.nkmYM^*Fs[cnTЧWcfX֘<3c{֘DDkLQ֘s>3G֘1:35ܨ&]'ogH5f5A(zz5fjkmQM ^Y`Is[cnRĵS֘s:{1 ՘al15F5A]C֘%8m17 ӫ1>s[cnTgWc֘MOh15&5AS< f[cnkjg'XcV֘՘W֘sѮC֘515F5bY3֘sdwQc1I"ӫ7Azf ⇉L͂SplS2' c9lwk3>>qoz{pIzב5] <mx $t*AX 7,3EDNMgڤ$5Ԃ,p05S?.`sJ#Sn/nҜ/:fjhJ'~JT%+cL#,A*ŵNf`Q  IM.eO *gSJ?R})2;j|ׁY#$pGB0#lX:#u)S(K̼ͥȕ #\;m\$ 7@BBVw3&dZ24hJ [Dca n˘+ɒ3p Dzb@Aa`9cD9*^ê9HV. 
mW>\ )D :P'Xsaǹy$ 7V[CAQ<pDZ`NH 偨!3^[ƭ'R $d]OBrplŠ2( ,)MCY[L[O3P!$IBEo Z  k@ܜAR6H9(1ܜX3FQ^q$Є3BѸs Z36Igd8@(:A1ֺҒx^ۖ)j`~h\M~e c2ů7ipECoz}{3@I&_;L[Ⅿ~xft #SD%X'G'y۫NǓB{Yi7AezL/u)u6VrE]ܜ8QaI:'3ׯܓ(>@m`hԶ>r\jKi=)jJЎ NQʬ wF{Ñ3!asZv&hP_l@)Ol+sɯfr0g%,i Ad KUWry2?knlj9)3Τ(z!l]k=un]St"v,sJP*?Kr"V:s.whfנf5 uL?pF'#3—`~5^יp:UdGpuDMDW TH \5Nߝ*fc ɌLU**6 =ZppI QmH[mxܡZYtQQ°U l5R$3tY֕O&n/~壟&&OI1 Rc]TP$fr%wz;ɋ~:YK]3O?|`Iy2w]I4%f`W)gog_q?޸%Ms6U<9 /ƿ\lHs?/zf4<KtI.REOs!W k-cW(ī+^ ´II)>&>+ϭ)4-YuU6xE"hHT1(@Ek9X9 [8D4̽v<j0O!!,gQ`^m.B@/Jlpoo9scP~YcE`Ql70,> o 6,}qпP YJ5սqUG3M%T[nReyFcB#,gFaZv+B"Y#/> /?mOä秶WaML/l }7?\kZ\jAy8:ԴeGc0 EΗ jK־54-If 2,?p<#J]̥v7<,kqEMAU]W!bR\=Fm>jÔ6xf^nZ ?uzB)PI(Y+ B-f\2,s -MVߒx܄JU>m+,Ca,Αܡ;U,kXFaʹvf%>1Rmxj ?VAjV&& [E=s Ѹp©w >(`KL:9LqMLqjD'-Xs +3;O Grzj,"__v+ڝ [M؜ _֨\rU0*WUTZ '3q2έs0=I:1쁃 $'SKy[2%F_nSB/9f !kA)76#.zevPJnT#uK$6RE>"--fYʬT&^P9\2\q&Ed2jAcƶ%Ձd4_6^ҼK)}}8a %rjQgalR#m{˘r@\2Kd aۊOH"i.l&E2Ztga &1Zpɭyj p˜Bh3K21 Bض"[4ׅM֤ &elu05HXJ&307?Pu(*h #$&D )\a7o!ExowT2Mxۑs,8i͝gNz?~M~^M f y/~72I\wIZ<Ȣ˽3\uQLaxN˯[Pm0 kk_L(iӨG}Ǐm P޻2K4j>5Gh>RUM.T$ǧy5he&d\P!6rLϔb%ăo55e=OzEV`2n&/R UaHNRl +Ѭ0Z\|;58|R۲!`(:s:
,n|x\>%޼{A& X~ 88 h&'[}>8GA$<N*)g%7=> æ;B.1K~bv(F` >9~ouџ =>0ZKccBߘ[_LJBc֥ނi&NTigU(]mzgٯd,TC l *Mv-ww nF׺~3&|NDk)SF8O ³WPD}_cQmekm8E`=Wy8A]؈KogFD%j,'jJQ؇BJ>3{b۟o9cfTlH(4םHRvHܟB \cx«-247G بbdg"EU5ѓYrKmҖڻ{vpXځhvV' { OfKܐ{DXd"{ݕJj`ewIX9w Y R$r0Pd*deu"W#qi81 AM;ԔuNo(u|oYR@hE[hrh^PCͲ5>ԬC=iB$-kZpLrNEP``& lHwHw|oFifGTS(vLqC؉,RDZZ1 tIIXJ <ҧYJ :&\LنA*1/Kb^vix XUz£"&w]@6 ZY.#^d-=8)DA@ BD3>Xb &BR"7Gг"\2 %Ic5GBk$iI7D]2 L\"=kjIƴjytLYk%U9Vb\O /5==8_P݊ ,EpLx#xMU;nU-ADH:L= Jd $2AH_X'1A69>5އOr@Lt^GC;BĆL ɉd v8+؆^~hR4UD{kgdhߑ";ڨ# , &E6h#kӓl,8((=|y}rga5I;DI1"|8E#9mSR}Ͽ߼_}%y ^5;E_|zG^ #Uܟ9ߜgW׋{la488ٺ/}rĩrɹ^g٩(+4T&kpĶ':iCؼe1W0&p 9iȨ{Wb`Z HbV4\. ʧb)-CҡKes(7%'mRӦѹSB0Ja[=g7 iӮ ֐vX|ja7 !0M;iE"^&a42T(Ԣ4$qco#Vh|*9RP-¸b1%*hz4] lE2F* dY#I2^.%YR" 6mAfpDLV1؆ٜek._bc[a c*R D}:Ҡ¸;E  [OlHF??D"VPH -szFʹ bEiDk[v㘚XaMT)R\~),k0p mN-f[ֿsu~! LcrJ,$kCTHC(ܩZ Q)-V$i5xVvțr7oTE>D C]Ӻȣ$+ƻQim1**%|+:66F pKi1[{,ckf{u{P^_9&C-}CwԦzPP/LԨHҗ}Vx57޴FDR|uWr_pRV*2IXH˄}a\ʳPS}K å>s$o ԒK!F IY&N@,QBj`*d7'leLXlv:>}\`VfϚey.VLbN?)c^]0JFB TJ5 Xi^;%X@9%!7'XTb÷7tu7RC~sqo_==[}'w׳~ Hƈ]6;Jr~2'$h)->ҷх|T2F*zt ޼ WhkMqhqlpx-g_pPY k2h:~PWc0怰F?L|{>-%]1+I\|}q/!qL]@ȗ|BzbZS"᧷7YC >_>>y~߬/DaBt3ktS< ,H]/~ QO$HgHF_Sd~ ts~މ-׽{ b|)VǟhŗW7//)=AƇ~qO>Wv^_iۭUG7? 
WN 4zX?\7Z|[eR[90˗Vk= ଭ0å9-Psy8kaVYV+rmHH2*󼓝s|FPH@A'Ifӵa6'|GrcwP1Sfs \|͏d%YNU֞#O' Xu)P kZCx-9OeXJ<C_J&]@ØkUG#XnvbbYVRAIt10UQ.mӂS5-͚F 'YꚖI1f59?ɒF3  {5c)-ׇAn['"KnSGݮq[(}Ԑ[@Iҋ: וƀ:pqfs+Ȱb@˅8YtR,B>nDYWM|\z±ĭtt 9!VTtXD06퇋EOϙ_"SG@2.חeIAG+Ӳud9S#9{<"?;5fN C?wXbZ'qZC2~#*Y˥H< __W߫OjWzɐ+띨7/9^mXf'鶩Fd]n a$>)BUtJ{+0T|\(䴦2:p;iY^-fqFdXyd1n*2V, shmeuP J+¯O|<.?_Wْ.vzA*omrJ}e@בĕĶ"\JwxGtp[` k[h_4Giå-s:/h6p.2pʶ\s׾`DM",Lφ/t8h~rW̖9^2DvD@ޣ_*V`"ɳd "or:d=lCukIMXcrvްt  ;_K9MG>}s{Ɍ:ر֘NG Pl:B{&?/t9JeSGDJpb 3SJ0DT5)gTlgo_t,ּ 2 tD>)R++UPi׿"iQVن _ CXw0f暾YQ/Ȁ6l i`ɣPmqJب%G,]Qpn;F\Emc;; Qq+6/#Ǩlr냿|h`DhZ Ksg-4"hcy,YҖ9^8P!>|n?=^;h7~97'q$q_M+w,m47)|I-ܟT*;YYro.+ZmA9˭C LѼ1H>FEN~| ξb䛫峠Q;}u[\(h+/nh œO'1og'g{K6W5bJjm>q{ۣCUNOA8)m"PjUo1{þ(޼d%_&Pw{dA(ݷ(SM^@ eF PJ6t!!y07>"YUr,WrY<".DUrez*XeaNHt9^|DuNG$]'[%Sh!В hzU?wm'q x;W*ЏY6vNJ"HA{`@a9w`?j7 3:w8I$Ksby)#LsY"_1bO<ʜR'q%Uaeƽ3#L֍ n=7S*GV-]7E`kOX|T4ӳu9 &щ.mF#c9ةHz% x 6ZCҌ0FvV`A1vGg(˝1p`&і3%9[c?Nq3ˎ{&\" |-T'qx '-~ὂ뎺1Q;^6^`%!v_%2DAaK2(PJ&2@A y$ X{fږA)ԵZ1@vPH2 02)@5pN==(D˥63&ȀPk$պ2ݼ}' ֞m,[RD"$ci1u.ĎA$A C61_![HAi#7%DF6 TEZ2×>' (Yԕ0y݄нko]||8Op+/z$^q^Bڻ#^#t"'wEw߽k lR}~V M(UD9idR I*]!& ?-}wtQf~n?N?U">Y&dOg&`d'r3}q~_7&r K%(/A61{l~2P|b83?Ö-h4| ~_,//r2̸;rXSyI~x Xrfk6yX/ ^)תƮwf_<6Rk(J%.s<ܹ,dΓ e`j~`ۊђPKMXi0;+-EYA7(cp)*?A>ށ.F^ 1 У,/7.l"er pgno]| VJHmHY!G S1́A`8;͑ySM"ؕRjjKqNb뱁c\{X|eY <]3WΫY[>MJJOY;Ϫ>QgU/'%YEJyI{іw֢}rSpw_ps쫋Eޠ3> _G )~Ua"eprWV QFkRso,cDTm_y3!:JF!{9Sw˧VfDZM+̋'ʴHq\9-ΕBZd^lX.}+̷@17",{do/Y$AZ-gLzOEF)Fkmіx,sBa2'c1Pix3J`o{d󴹃6~My;RO) Fz'=N%2C9™Vg685΀F1hw,6l~9_}%B>.=C~R&a!`Pd(wgl_r؁9|UWG(/>72+!}o!7QZl ќ=xePƻ @.#͛@._~6"ӋZǝ B"_lpO,q2u7'^Oz2ڸ[jQ$Cˋ^!0E;4V'[n}}M@:3Epn 6h 8K_fӈ*uje0_X-x1`ɗR\k@-ߐXоP\R+'"?yӏT|>pW\=VYЬOe!v ,`& ZLbE%2VPrt,ww%E԰\Xm'* = _O$>T򔎰#/%R}?I|P!x2?x|5/Vzk(loDznG7deGbw/aY:m[o*ԯ7!`:B2OCY>y(Mmm-fxQޟ̦SPffIBѴXt?@{{3r U lc}@2Kij)j$8.F44jpFĒF 7ƥ҇M0&7+(oR} kun9FLґ͑Z猂 O(/CAe1f?U}9$;*)d|:]s:6r0a. *sL&)Viܧ}hQ/_FDwg{TdimvoHunUwj St4S$prUMpUqq(f_$5R'a1P ”:Pyt|;-qvX醅J0~|7ў璮=fx|B gerE:+&=̱n-A2^8$c,؂-',K. 
/Ѡ(f!nvf!nv]U[!=a!Ta0(eôYl)+2J?{ב $@f՗j~Ib!d`6A_%%)[=s!\(3gȑǰai8W_WUwWPb-mQYG .RgTމ]|_3'ѻei%FU!ofL;٨r Ng2Hup_O]m)E6׭Ȣk yO;ɚGS+P2j}pVYg}p-iY þVIeH%Y%0n# H``Y*' b3s hs |owmm{--*.['GM(شŽΙ-vb gը-4D[y`"$mdi(g*#t&ڀN5 wр6ڸ<.N:k4?|J1?n@K68֋a?5 8gmNO2ӚOD["Ail{'ucM%5/ԨĭKLMEP^-kQ8g$dNhUV֢a wH72[UGDah^jlVvL~ˢ57eLV,V㴔 [smM6D?'d71(mkC7emllW=|R^G?j{_|?I>b ^5~5Y77W`ՓgtW9n>=~z懋{۷ q;. xߘWçZha&7en/ \GxI8^;KΪ<ݛ;KyrJ 6'A暷 ,& P;[^OKb)7ͺ,Z`Kntbbawag-6YE+x| t|[yAxRX~U]4vwwRxM^{X!yϳG1U_׬5}ͺ}-"1bwYOvnZUB^۟yl;3hO$߯RﯿLɱ[7@.|gͩMY|mRnj`g_f1:{Ned+Jq(LmEhe.K|ƈ?V) 8ܿTMSxqF@zU=lHll^ K,7X9 K. ՘j^Ҁ R 'jӅ8@If{5IzuƓV%كB/ww~ױ E0qZlc\kSB)9o4kc5%=E?yQ GUWe͇ Fby?Y\0XWbeqpH:ȻA#G2>lS:}I2Tb4+@x:c i Q1uA>|% ]k]PB!*JjOk| 6 _y;c>̚B;)^.BÏ=^L*g3q OIɠK,gXikoz}IqǓ!2(SD 8ˉ򙧬NH#M[ݔnHKc7xts;n\}S~9_ $g8ybl|sۇV h EADO{s6wƧ|wSAUGw͞ xĔ})8IA(Ja5H)oζ:Blmʓv[x@Ճuڝ>-D"g&oD) ГgX ۑNKOKg hylUXE"ζWoYV:`"<2 I"mLD}J8kǦXt[Kn=&S"Hy+χD!)":*I]; \,j\w}2W4{>{fOZJO"qzlML'B$z@x8BH&ݠaR:#Kh[9 'lI>ljZt`hn& $zth=m4̌Sl+oNȠcXJHvy&pBϱpHG?t=o{%{|Ft~OHs8`Jl"4/b+IXӕ\!c[mPm7h5W5555V2"zH$DCcrFBR{͕v42Vd =Kͮz(-c) /T|i:^ qV8BU!κBqI& oL7S&G#L(җh8XURI>8dZ5ЎVgHҡCE.4gJ(FeUH:IhZTZJMv.I48,)똵 k *9p/#dPYW,^UKP9l|9i!>4f5̚¨ ]0gsٗu[BM+URى/%%3"],e|2iE8Y pW^e=qSt#]f7dY` O0B1nV⿨ ב `w|0_- ?ݤ{!ĻY+V CyEZjKثR 4B kfaGM5X7|1vd.(F)6kKEH$M%4+ T3>E#7v4M B6y}2HMJ).SLY03BJ_t{ uC޻(T"Xj7vPhmaӈJ[яAUm.XR6V(M9o dcfK ;"bCg/>{8^Y#=&OCqͰLR)^R5%Az2Šʪɑדx +6D "?'^:&|(c/땉C7@Ye:4n ~Di G;~gJKfa} 3;Vk "ذlNÔfZ7*M&֖ͮk[F}v̡&YԵR5CYjv#̋qvՀo&w^UD%F6HGzT~z/Xĩq`69&PHps:AbbvlP߉_>FיmLcJI2]3c5O )F[`>9Imuf`"=|Sd% b $ f {JCu0  L*: V r 1S z(Q;`ĪSnR"<9uj60~zbN3){W"86󒴙RՇN"IfGfJ#a2au^*ЙxL.k,oITHk0IHY{oUbGEBn O@qAYDb#SfXid=')IG B,= LQVqv3R$?^{{qDN39-KHk\Oo׺L^km!nЌQbmQN fQ,*r/қ۟M.S*͂pYۖ1eH)CW~#c xi1\8\7ՠs[p?|?B(0)H#twax I&@&5{Mz(Ңr^ۭZp(G=*ӄ'd.N3}|ժcٱj[:תihѹc%er[/74'/~Yظs[۷^~'ܑ=~1b@Ie{xFF l,lѲPjD`Jٸa 댖`ۮ̛~ ٭޵9Y`ە(chkt8U면dQ=l4Vh =]ʇuow*O`NUMRrghj-:|5qcV+޵FncbeVC~J,ݙM0(8mlw&KmYU"EI%_Ƞ-KwsN)̡M2_N!|BWXdZ?n(A`0$yΒ[>?_^Bx?|펜@ ݬkWzl>nZvY2HzߗG_^vDZkˣϿ|>|xf *qz_dD#۾8:kź[{Fio.fhp$ Om}Dlk>cpR"yR~M^" '\Zj5g 
tpAɆ&ÜW{]\}u6bS6!KJj+7F-Zί>7O%nLh>u2q .^E&(vuhAc;UKx>c]]v=JЌ]|r͏[ufvÏt٩ iusi섋\ m/#VW߃(W'4䕫hN;>_`#)Jٚ5x\ OWWЦ5$q~wL\_nmMn:\%|57'g'QIWד;4fh[ũ9dCziVP SNB))5 O"1BR $;Krz)qʉ`q+j=a3rqn'`UXo~&qcBL=Q/l \ؼO wjs=ǟ7Ehh >9+`}G[*\<|fR%&x:oߵjq~dЈl -ZgN.>jc'z\Tһh +Vrzp\!H]l}5wϛCKIr7w/'Ls^^&ڶh5d7< p"7#|@)UJN(G!$)oR~jo;4STmTq"ʖ)\\spr8ثgSmepi\e..|N[xu !+)~Rs#*Kj^׵J.-PQ9l{kDQJ5q\LbQ8F譌[ʒZ8 ^Q8"I^K6R F. y"7uuq,܁_69UR ?0lk`UIuXW$ufQ^3pT.PxDz/6Nm; *DD.+UB %IP;R5e`>ɳ3w;Zj%Eh&X7Go&,[6ŰU 8QCG&T9P#D^%eJ'B7D(g~IpP8j+NJ:_#AMFrYSKWWڙzI+Nf (^=ˁ@q!^N&,Tv:AE5 b:K#Ң6:JNZc14~rՒ1TP4d 0r]K# t =V$}1*P >K 7;#1qEXԕRO G$M/U0%%(CҤϖ1Ipt@ɨk7Ga_g'I-!Z;H|PIp5e5ah.Fw~ ,!)MtG2)E}H4[Xvq)'WgThX9}'t`FbhGw]J<['f 4I2ҿ>n;ٕLNH/ޘ8[\$8QU*1 kl @ /CE)\?< ! pٵ:CDE"/Q@`HB~Le]EhU6,E :<2Rk`aK+|{_'(gMox&X/¥uSk}iL" 8\bRkKKaEa_uMcĬ4+Xaa3HYg`,B1[ԤA2w ԨeʍEH́!/3Ļ⋻V"f!7Ÿߪ :+_!8[ _`>Z pLC!'W"~g¦;5J.hv0آDb,\ Ŏo<Ѱ~[؏E8O&Qe)UF}2Ԝ!}F|"ccKn Gxsb;zR*4}WgɄK/L/Il}I׉P2DBתyU4iTӶj#XjBqi*j{+\daFS.yj,H( @e@Hh释"ͱV6^?|#+南u\d7v]U\\uw7aG3gffVlmo+hIqj?6|zJ]my0_@}ЅYY`l5J9tӤV@u9zԆC!uJWs^9^3Nw5 F(Iv:Q1ejjnrn]9I%C%DwELqyϦ"N"-VH)7.Bhi ?`|:r<$C5MSځzMӔS4MOՔ#-*ؔh-K29ɥ}HT)B:k,M._Rb82QQA1G@b31-f:LwP=p0 H,5k5zwmBk2/h(*^WXs CR΢e.Qc8#Ը3ĺWNao{-/n=`_emᛥ*z: L/^;Y &1CdU7taO6lN[q:g5~x دb;{m^5SV>ME5$6ĢI+Vs(t )C6E™J* Nlq^#<ɒ;Sӑ uRӕY2;w&&}>sjTks㓯:Y$":Y++8y$Il bZR@%u׊CRV1{SSB1:ʋt|v|E uit<_PEm=Rjς4Ej#վɈ6K^H| i_~!Y`ey^'KI[7"} \Å%!J%a\U;CB+m͕^: G㡢RF*Iy%ua<Ha{UKn_T@#(]9>&+h:-֊zPkV2+R3HX؈ ^F *@RU5dѠ)/PQ먴uWf:Q{g,8K`d$pbe q>{x+VX+hXdڄO(8ۭAhRP4:cÏRKTFpT&Z,e H2F9BHKd0eL[Fw"Ѡ\.Iovw#I}$ɂգs#+cQBz$Te-w[=w2rZS+ф7<7[v(j]`77؎p}@ig&МuLiW hث7h9ׄ2u,*KûLkTHnaOdݎiz\q&Hޔ7xmA_; GJ*"Na{X$`[ tIVmK9I7ܧh2PW$;$fF[F?J/箃H H! P CebطکXqǼʽꚂQCER_n4n|*"hHrxSH*c۞#Z(ZLB:ERڑsM˒dIҢ4!Q$,IӓKq0HʑYGb~0cwԱ~@7j#8ZO#דF8~?W&l]%^`/j p_Tw]V]t4?(jqB핻m~ٷw9Vu ܏ &6Uu㽬;o.S}bݵDx!\EtU0ߦuӭbZR RT9pW(V2֭$Ѻ !\Et[8g׺IZR RT9퀷 Sκ3HnCh+WJ )D9#?tVIy$y e=x + . 
@:Q)cHT#1v0h2(ͦ?EAXP &T1AHM,;'lL)r2]"hkPj,4uX yrŐXrIDOd=r@K3LM]TrTeO*&tQFP3jtI0f sJ2Chx\KO!JۨnK!ʐh$=yzr)nP)Dը+tyN'ű\SZ}W@4zxǠ1a ?dAPKJ|T_;w֪AT M!P&ʄL_.|ɨdL3]k2jpzsc 7:6+%1?jE J+ %9YiL5,+6r-)MZCKbTRzÝL[KSK !\Et O][))Sv[$6c[rD6rT"#`LLu~*3 ;?E"gz#ɍ_b{<w~2j,qNUd7},LȓH @#)HxUYLJˑ.oא# <^,1Vk<ȕlhAWϮ&O܍Q 1:ŽNlgp˜`]>Qq&&%P%8{ |Gt#J))"= "݁SR&nŀc0jR7O/Whn%R&_Vi=-5-fBe)I(v]mgCYލܗB́VB|" FҪ0ۘDG&og̛}\ͧn9sBXJҐBAWs/tCii[O :Bq.ff ?Z, `g76F965Foa:? U xf= ć0*eG&} 8 829Nca k +KJrQ:ûK.o"0t! ZMd&H$T \߷|Yo|,>-KU*6MC3xe{[k|j]ίtP*8"F@EbL$Q5*z5Ô->aWU(lu'Cq Pb*WIkHctI2vP\65RS,(9rS5nh$ "ц%u&qkk8UԔ{Yo|@!j98 +w5smTQrzNeeuE=s#6l=/n·V95?׳בK&x!.XzXzͧc㨏S̭P׻!N*0NUtkl(wi0 v0\ [)3=3RJ)) 9 l}2:tPØW((hI#q hBʐ7xTKʚ7lOةK#ŒA&z^wv;Ԏc=Zn퀻;hů4]6 ZG ՜S.ݎn/w'` ;U5˷O x诇*?Y=p9jy:[Pp4kΨF9_<8-<ǯwpB(=ÇOw_Ɔ6$'r8P 殺nd2ُ m?7ߋ2A!PÄYOes@F4_z,X*S18  ejGRʐ@<\ reu6kJ:4|BUKƒOPHE2"h)U"ܟ!am+gt669NQ u,S;7$җxBHuCN%)É114ZB7ܾ٢G츀K璐f^""B>AA$RצlKlK&]W $[wDV +PWD~ ki.[ }kMY'vo[@KE^o93l*˔!~5Hpru= ?wb#Z7G'CE;:8@ǡuR+M!osyB6`` ҼI Zs#N&Ã/b+N\]nGT&.C gV؂I \f_X9^CF+I 6@7kB*wL1V&:HbƬ`o/t 4KQ sjJ-w6t=#XddfbuvS|H|ԖZ2C2CI{N,>Mu9tBk+]tIrXOfېb]ʔm5YצQθi"VL:^?mmK7_ rPCwmsۇzטڞ'W\)7eZ♒j[oئ7PUif'ׂ❎DpVX3m|qI5\\N힬[$gٝ  i ֒]_٫0hT χ=Vn43qԆ#> 㝯*p[eڬKdq|DlYන@UA7Aڨlh_sjXw|^ux/j*$r6 ?)vB9o4)[[ :UvtFɽ֓w9<7&\V fP< R$gD?˩0QrEӍ+f" m'"4C`WB gA˼f)YD0QS@184T?ֶVܨm\Qj]+eYRe&E#~/| hCU,K+e0ѹ0wX9B%rp*-}0U0 +m4؂EG)iˁ5[6f[hmDS9 3oT'yH}g.v䦥"Lh `P*F0,AdT#-jwe-LXFA ]5S[9y]fl^ϢR48~, .'Ú -6g"7"]>*`|%LzA8uQY})ۘu)YI?ei#>AF̄3EU9W (s!#]"D)Dtn)+P7E܈ " "qdۆZȶ7_>d>E8㐗5H*-7S~ؼͳ BUOzy :7V^ω?=N?E/5ׄ b),J8+U1>bb6:.\sa{|?AvN{d R,cpe1̉z ) ߘ>l#o[oG?A^ǽ z%t\Bhvww{â&/~{`x=ov/vw^zfwz=Gw^aΏv?h^CC/wAXA?. 
^,Uҝ6./9@~ٸ/x)Nol/l&J"Ə_o_ v]Mw6|`I7f.GDR+V>;1w6>[{Sr" JwlƣSňM?8)\<Wpp~Vg?_ s2\&vTlbǥ/'ݘL]b~~etz:$+K1͓$r>ze ||_g}3x0Q:| ~EÔ;}vY6}REe|3!M"pdp91?zV~wayfO׿{~ ׇ'siUxI׊KÝ`F^{3FY<Ĕ =uf0M oǣFŻ>-b*ٝ1غ/[gS^?:/ֿj@n`N⋐,ԵYLdL$t<{Bi ._,t{EG\?,ZBc2^donFgasMPY ,Q@ɀ(0 E,>IqgM`"6!͂VۜJ x6o=^&0= h4 c٘{-N USezʥh-6omK aՔER)*ň8% 5:@<p׺s4th<'9:GCh s4܄(D-Q̞ f>K>K>K>+CؙXsl62tplBٚʾ0f($h-@kn%Q!*5DQ`Uz L(-;ݡ진?ϟ;ݡew(CC͜NUvSS (`&ߖMCʮ2͓vj]5bnyV+l!)_EWTэ \I1+Wry*Wry\\kQ2H 6dR4j Q1XKETr Z?eCǴlv.hAhXL+;Ō+M([d1Uqp0x1u`' 3R#7CfUx 0R(m}" rh(Y|~ w=;G{׻;gJ%!1R)޷&R7_ZwbU]Z5"gcoLcY#gup &B~+$g3Y,5̬Xl630r ]٦gO rfG\zN9ɕE BL9D+uXz}Dߕ ]qFg)zg)zVο^8_ی7Ց/j%mQ[ŭZAmO0ԑ++# ^i /m{n+Rk眯1$#x%շ\:-Y*Yb-1Q,opXJۖ#()boE↯oDB&JD s-eaR3Ņ\ B9JՋ`.Mr`J9&EA9 :pI66(5bmLRJ,RJRp&o#\ ʿ ڨ¼|@{ X1Yaqm$$RX_$_J , nb%Ռ1@P`8*Y'|).ԯ3/YQ,R,EqmV•\4qzΓ* mFZt;Wp-HX+[K+s*6\]L_܇8SNRYXGþOgec# p|ۍ^ Udۚ!ɊI &T8 V>sD1d*w`18UcH0vX$b5`aR2gA2ıQmK0eb5e3;˗%d._e| TJvmK1_ um3wNVomR`%wݶiO@B_1![gRamΜ A. H9)I+bIY_$|ZNJ8i`*mҩEVQuNZ*X(sRܽ1?xu,KTElj@jII6QF9)kWAtySsRRcEr{c k⤬['P_^>%ܫVR+5 Xs6OuRTݪ$g礜JWO@k1}(tHqE@Fa*SOe'nR{@a6~xn.{7:V(S,j*9y_ ? Y0A:/H J"$ki"\rBw"ocfJ,ע;?bǟyƮqW+v;ji-MrA^lin0^Gg;iK}(f}0ݨ ZWMb¹TY>UVeGATMj)p聿6Xaaƚj`8).RR-u+JւH6 XГbAO" XPǂ:Ա屠|rk Ziu{Ia%|=c ; A> -K@ Uh7E JniS֛2JWD:kg,Dzx}i˖+;@HXUϩ!gPYwggfmw As#qJ/8z!DQ <i<${tdnU1DFrš ɽk׿=c΢,j}<ڳ=ڳ=ڧSfjVBD)i~A}쓳{~ @+S׹!><ۆsyS=tG{))RjB }Њo)%tj 9*UGM]d-T5%p_鑛#9||9-^ݜݜݜݜ<ᱛ 2{]^$_ٳ7RCq"qg9j!goGUX;%ý_~ȑDo#؛`z*78A8#)҅+bs\j@(9HCǎPGYڞsWȯqgi{gi{O'm?a'cӯ7U~յJ I|5N'ͨѾ ]Kj)YgX/> 20:x;5;aeѰ?$n6FV\׷qy}LJѸplBL⛗_\_Pì`p (ε~Mǿ"AH2 Qyۙ1-wD%8&I9_޾x5_^HRT+*L*oo6oۇm/!l8Ic(6ƒ1Ќ ]Y--rP^K3][xwQ6=Uf.'Kez"_&#KH97'18֝R(gge-tn ^ :ri7vsp"bW`㽫OZXoz'ի(z~J=dKbRi(λqBl,:dؾ]N\::ԖfOQإȹyy}Y\<)Te71hLթ$dפ~Pzh)#NBCIb!SV!l q]-&b>6r\7@?f~ǗF)aon70ESwO9dBV(vܩ"KyԼGHl -w-Z R e6P$'2R ͷcfg`~ \_c?^1lKF~5ěC)䊶 NҲL(JPvۮ_0Ćř0Ocpz5W20ը)|A 38(cKٳTtRn%}}H'S.ڤT hB؜ضߜ)\\ϦS[HFޥތ @6Ul":@C CR4r}DK^y<vԀNXT@4-w9v@_Mq6_:ޔ[Lq| \~:\;΍d =`Ij^GX pNiN)NqM(عS jF-ZqH$Ϥ޷s93~tT@s3!MLlH9$cla+XSK'.,:[xX{3, 3Qpx! 
߻vnHC) )'Sԙϻ)cCrs1Q6I-!5Qד(Wn '& (g3QRËx+K}A=ko8%ȗ=dGa=wX`o|Eug:q21~l6e Ә;1Y*֋"_:0ACFT3 ,8cYgB:Tjӌ[gẻBK2 7aTGz?RxgC^$Bŋ@vyBPE!H1߷Uiw\CY-#˔[S1J6 Đ zT (A.T\Z|yp2$wPFh(}PɚnLJ5;yuk$QmxR3ZI-˓M 'bBS;Ц˖OF~a5" §4bG Br%;rg/3>FDrʴۚ58Q!4?FH7sJ*DŽ+FrT>9*'Fq?Kc_./Ous*r%YBI)Q$"19w/gKV\̊~y+u%fcq,a;D~~}U$S`kEWNV3-jO(9BUr]pCl>'y(C^iiys[ND&X<gi_]ݮD&ߥ3~fr4 heL%4XfЙT*τH78uY.H3N}hXeC6?-iC@gHDD# ̩wݛP99s,w&AU!?b5YuCId@8Pmz3qb/mtܿϊ>I| ntmBV~ 9b U0#)\]C@Tb| s_2%Ϧ tb !\_$±H/J#G1ыw+Q d7 MD2\Il\$F*Up-N!ٱSb^I_٪.}=>RLyEQf[wCe7)LNwgxqmLLƅ/U$)ceRKnأP§Bҩ3ȄGG~eI!cՒN*xNR>2!UN|yu6{G$&|ķ{GR#I'5&ZEdûd8 |6vG쾕{n8ՑIڻۘb"RR1}ag{\O"AE eH j"LF\r(JN22ӌj'}yLՈ3bi ?3\k ^{WڻUW=# +ccErAėKp.MZNYt(g}gݨUoG`9 vE82`7 @Km |~WW;WZ7{~X4;QƄȜL?Z5Nلx} ItaJ@qow|G% JWI<Fy J%; On`blu8<?9BI} ޸H_҃h73ba| zz0D衄H)'$OL>+|#P=b+Lr>kq tBQA^$NPca ALHe?]bXi(89kQO 0M8:Q #2by5&+2/d2;vPd1 Z3FEj_3,N̎=M(8J|䜤ü1ODǻ=ť&9L^ 'KpNxNrjˈ8IR9QzJKU*[Ⱞy,Y^vDLI { =iol+HQ!}DK'$6 ;fH]Q^i$gm@eA}D(tlPn?qB5LG&/nHqeOtL>9h.).Ok^Cb&$1@g0@ioOhU$g!d{'l i;,dZ;C ϟ[AرǘH" ƚll2&S뱾Aޤcq1YDXj6| :ԯ?&jh^Na :<&߸IYqٗt{( /+{& %a2VYcB61LXЬ/=&ٷxk+n3[) Q&bs޾`YllOĦMX88ǭ=\XշSP`*ml#~كbv7pTSA'{; {أ"/g0]W\<of˗Yo> HUⓚr,OB|ǚhB|r+FY׽S&`A 'QJ&O8-; $l jڼb{QAVXj6cooi 9:r#_VBM4Nj|GĨX+h;*Z>#́GAf q'eH(~T:PXaJW_Z.QUؙF~/*GΤ() $T6!DsPd.6&MT곋1 Ag%g#!C/lv7> '}HD/}#!W6FBN _oe#%9͡ ]Ô7q; #hKnj;f@7;yxT 1.!^.od.k O8l PD{(Y~3})C%:Z>8A6Ll=%Tipٓ2wO.+TɌV%GS`kJP2DVBkJR%JehN:v'楐5'ݺ ٲ+6;45a"BA aXy( @"#Z#S,v^y^[|߻"@Pg6#]2q p[T{EOˬmd)zH|&E)`/Y3v>ò ZpwE7V:7v۝Ppԉ_O*2`9IQ,i&EJኁASc[cC9;Qrcy-ñG'%$5h5LWfo`w$}N9 M5)ˎޘHH(ўBNjz# $"{Og}eI~Q4s9{ӈ_݄Ș#q?"hAyhuGA+<ña7׌r%׈ ^A $E=RƁ%,Rl TK:\s i | UgeR|>d$b̲Di3IfN'UKнw|IkZI參*- E񺣾XyJAzu\&*F_}%P[ *PSX;H^@R $qǾLJUe:ez*E,Ā\hڲE0b F-+'OM" F 59I:mDo"f7}?\rɨ 8M7uVs!_V xS_6rw+wQDY6v~ CR0don*ov [Җƈ:Ȣd55kF q4 )"ڱ^駎&۪& CFH]F4{C CZ8<]Ie3]d>l, vRRIWTu 8D0sedp~̮bue&:WCO#ۥ!R-8iN<|W]t*:{|ٍNʹ2TBSjji?}N˭31֮$gA5~rfx0'k>r[S}Ś_:K7g;̼Əiqƚ J65=IХ1j_u 4ĕ DŽBC\iY:G&fv;^g_}W^܏puRt?q<蔞>99%ldאj tK7c; xv7n|z!uQ U81$Ğs E+Ѳ{yn?o'4{:5H4C [:źn#z qj!%ix6ia SiݣHT7(`KI:Ɣ> ewYbɺRwi8An$:JXkIC#***(p}g V"JUز;s[ 
c6@<#a*nV"mv8e7+uw7jg>Vj<'}G u AIh>utämGKs ޤ>01ehj\`tA1@t$p#$.9]m"ir"^z3a9@NQVҗklP/HZ!K+7W ^li>G{`%B$(x1]V%ۖZ zj) {b\ze/AWܑi;oS%66'#69['5i8N,{> :?'\<|SU[e!B%^>XkUۚ}$> s]De~_ZRLG0nlvZ~m:ZЕAf;]I}Mྃ1͇AR)$5!(y'ii3Qխ%#0C{}X(/!O)`,rk-^b9k2*rH vnԕA9WhAticfG΅ ewI4Y-ܫpiJ^tCIl36YĮEZ^$Gط6sy}xh|e -}r;*?=/F0ˠi"`sqЅ*YN k2;MuEIzNuI ?3hi" .MwM.nJ43C p7g?­$N3G~]:pa8rf X3y$uI|9f%:(eϠ@97WK w韜~돿~}nPdg etR0JS;x*RI:/"K7sXf0P!v KC! /_ym-d_ݘ!T42l\.A2U(TCp ]"#"f|;)9-;jY" KSJ оBOz1_"Xy73FdM7i?@w{߆/ÛNv&K6-7 ;Mc;zOn-tH zm X?X{.a;_mY ҷPU>!~$wE*9+. }'羑 P }"aGT6>ԃlw~dֿHHyaв~- :Pfdsg2HpYpéuAZ8W!z͹ZX;?x{+W>g)|>^ղ޳|}k+zרTߎ{Pg/"qx_{unRi.g4fh&$ '9"fѨ\e\ ܁{_ٺ? aUPVu8feQ[pA3w-6s.'3vtp]w\W`Ur\ʜC;.Hdg8iv-<0Ts.jq`~LV~!55 ~%H vo.?fҵ~6](í>%Jԙٔ*yTKcg1U17d] I*Ӭ1P=3+|Gu`TyJY[*-úZ ź23GDm0 +c)kPչ&/'sN먨ûIf ci)xy3.!y[{?lD}׀U_]c+O_$?of6Lg!8M<gGqd ' bP'$WT2!7nނ 3e1FN U@.NqF8Hgʹ"[{x 6(~S1kѠw\1` .aiǬ7; NšoWQc>+^&IXq), {'*%Z:a%UHYoP͙s!\#4.+yu#Z[+1Q#6(sQG2ű,|)a*{C H+p\@Thso{h͑%+eZ>5B 0JƛLc3Fr)2i8mtR 7 n5Tj:I2JZD7 ܛ ٶܛ 6=SAܐbBa+ GcO$ 0C)-7BO3%Ao%qI';Ky[Αɱ >>鄷(0sɐ1@**]3xr&k|" dj1i,,Nn)TML*J6T$u ͽ؃a K5R2ޢb_1֖FBĤ$82  ʃ̩0U񦑜bp0wb &b-"rD0׸#)-ɹ)\[EG- *+B̢\ U{.~UMo]hq2) *3RX0<< Iea`äB2mZPdrZ2X`L8Z  Lwv/H_x*Aod۫¤/5rse1 9.ch*i]sy*Pl9я{Ќ{Poۜ"xv$x4;ͱJ09)H5o;VdJSw%\apd9{N9\r3E, y,O 1T^%Z}dK#(?4J;9 Wi?NN)ȩoNbݷo?V?A<9I\g$\HrbyG !Pm.=>6!S(6Ku qy>—en{SZnld=k.V<¨Zy&qc"$&- 3i2409MITiݙ\ch@$^hqM6Cme"&OyG? 
"IWə :bƓ;W-'7u`ړ5DVpo-=YJK:Ov)k'o-dͤl[?=;rU-ab2|~¦R]ZL8szP&u!rM(]1l97\pA9b1(Ι`JkX4ƂǰB{4=i="n^SİgTk1L9"h6+V]Z͞z2v*T-d>kAb3oK !!maX,+Ic+kZUޏIy,Oφ>46W#] A Ē Gkv$MrWܣ-1a2#iaɌG2TLZ*l~Gn.uv\ɧ:ŞF=y55-<\Af=$ wtFuW?yw#ɃA{r' m>٧ˊ;>OvJ}zI߉"7ߗW{?Ŀ+;R1Ptw{S{Q̙!)Sl-!bjH+!e!$p0N$1,&w֬Ҽ>Ϧ;`kBPـ}j-!r_77ϼhe0AAtO5B"e^;ܷ}//.ٙġMWbc1r4;:^N`r_..?4]0,1ècc7/܈SWڇhe@33QgǵSVj#W+qgzguP]v"@]u5N0Å3Y78jL^w:Pys9 JgDۧ;{~Yc5~lQEXe#$z6py6.& ]&i_viݍ\~l->%^GX1 ^Ȗ3:"sCy|Vcp0ݼ .5S д13]`xjytBbxse qCg`5 [I8r{"Q0HЖ )lnw+Wmׄ><*u֦:,wsTZ֒Sa:dV/&B">U`Olajl$W2UȲMl2MyCrF G%azx JXl/6LO`܁Kz856RVlG`nwuZ3#`L/6¹F\=4PSt1B 5;^͹ъqNQO sLDh^$i:J=?-ˏž&,Y>P#;qd3)s%J(VZXBx ۠j29/cźM;R=h$Z T?fEiX.=;}~`x@:e$LĔ2qRMO$PYL;f!AV A%' inE@iGGwsVb7UTk%-iу62qOHMc;N=h\v?.S_\?lӍ S& 5(0(zFzTsڍb~,^Se=:J-YKɵKAmP\U*õE!L0FSf붊f`}FiQ>XF5,Z'Eo])?T+y"p|4L`}/E߬SlhKK4k᪚M`CaѺ5DY>WJa1_{ h|<|>4'&tGцe^ɍx|pp28|  w(Ng>WHFxS:;;woGy;8v0?kׯޅ]'x6z=<n{ ?%LՕ_,bsyU(W׷쯄OO/rwЛ`6 1h?(a{PkT-~5o`r~a>[pt?/LM8aԈP*AUjI '2oc ~m.Y%%081%4Ɔ.~SBz>9=ǤKzxC* FGMqcL^dϋ߭,G?O`8%9'sO>F{|Il]5BI(uZw%s}5Jgm p,ЉDRDTflFFd6/e' SN Y:wXU>&!T`3O0:"Df8IU"G4ZADK4Gq94VJ!Hɨ/&.T.tAh" 8挧4C˥e&DsBqxmH“B 6m|bH@/&rHꅉ -yB] \VDWм]( Vʚ(5i/(Q{eHd&XDtmehTx1A 1'  Ω!VRKiXE@G%#g}d!H2+PK'ZY%) E+j, &рI­2tBiD7~`af럅B8)j)v{ mJ&^C yO4J*C"bXISF:B cZ>nUYlcyy, ƕ`Is0ۨ-!Lt& h@c %RGK~IcF URNFy% Q_D)ɤ3Uc0R=ۏLAbORˈ]!Ee|+}rb11+f#NuW"pLb1p^-{붭I1=U|Λu*L'fvfOQz6 t4j|N`ī=(Z.cJAmO#ҼA`Sɻj5MjEQ ~>CM?]+Ƨ94K%HAQҦ4qRK4Aʓ}MMSu6ɌfMrrgC"YgaQNS er(g1I sїJ'KswѤы Kڇ(VdPyGC?#IDKLB 0BBHg8%oz1@CN40'dDMzѻCQ.II*5N yWA)g'G^\VOU>e;;ur8 MKi;{0x l,ߣ:R%GKLc-%4kwDCҵ^>!#^v{ӜY|3{-pe&Y+(yŸ<"Vm[_^W[Z&i{~^|"Z1-8-@|lT=y-'ĘSaJ9!r{@Llްej50A+]O&E9Y.3Bjl}蝩CaPa@eI۪6Bw)Tt+FVwWQFd 9%U\@wi,ZO˖&su50*.?j c! gW@:da  yV&`(< kS)9qF !XeA\e΃)39 GvțQѲy81Թ~Zs*ٔuQzrnàL$j?b kPARze'h̨~,l{: +Gi~@6'&b[ylGɳ/И],?ffҞ{4q-~~[/ ࢎ,&U~}EKi}sbSE޾ˉ],.>r1/B>+ꥈ`E;ςgُzt7_l URy`ȗE閵&Hͧ:\@'} @ft'vM {=8z =n˟8{W^ԿLgt7pot,\Uvdҷ"o*¾7>}(~5Gǐw;oC\̗^t[^ctz VA~,@W='qwɒqO'V7,\>[VMZ?ʡB| ۃ7Ç\A~Wlq$-<'Dm+X9&훖=Nhmw;|}VgD-f>ۓy`aC$nlUC] ?4KFkpr$FB̀+5"'k_c ?u@$@eJâ#5$o|b(pUZgTU[.?#l%vW? 
[binary data: gzip-compressed log archive (`var/home/core/zuul-output/logs/kubelet.log.gz`) — contents are compressed binary and not recoverable as text]
[دVb;].ڗE|q0E:{E_QױA|}S+1rCK@lX?ıǮ.>yt#A5EcvK[Gw-#:QE=|޹ΣROCp!f4Uyu <,F#TT4J`rPcn7:>~y:YƨSS*AhwշgSu4*8|j*"5$5$_!F8 |pS!Ba,.$f=miOtڞTr9{mn`-%&EGbvcZObOՄY猓kÈq 1ƁKzirs2Lɶ jb%0}JQ|%6 uJ-ZX]^ۆzL`Dyg:;aT@x8>- +"" #HR ձȴ۬/ R}SKggK fVܲ*L֔XrL';SҹHݕ)4Rj'oy0hS:**/UT +df:f4ڇ^xJR+iƘMI}UP8q2]NIVW50X0{4Ban~[R[4PX\j43gRhۡ4z|mkC҇ž)q)f:Mϖsl\^8zM`> ^]8 %i5ӫ/f~5ثգ)/ //"M|'zcSpsNsk%PtKY82 RsC`O :u(;:MMX 7$YDfGA>wP(*K}m{nMX 7$PSǻ $bc:Ϩ:% \koۻuDz&,䅛hMq4}&*+=zT bL'u[vRH":n/ݚn۔fZtظo!ł-`2fұqrOueYq^#!~Cv1X iR'G+}c Mr& 9*&0WW*ъ`>F-xYA]5Uz--nC1iϋ_< _J ZF6-j(b43}8h(Ri s%oyWb RzncJ0n?&+y,^T%BI6eDb ˹'3P&4f#Vy9s6ȳL=^ZXŞV7%<\QʰO #b 0A (\BJnW}wͲos<-Q8-Ά#w_zGу?E ?K޾yr/>i3>51"jˆAbO d|ſBſLb+onWx?A=Jإ ϯ)qff}z}5n8GB]D"-\"ut+Qrysc}BSj658#\L'_8`k><~~i7!݄vbM5SqTr [X[Ω80Bvfs9WxPU]m;Rt=Z02vBh˪kgM\6[rU>l&&%8Bb2Y@K2(jy9rf Ya9'%gO TCFE/1#.38뿻XE͋UQ櫋'oONe%/^*YCjʤHZm_fjB:, b=l,~:hL_ϗC6ts,vU!m*2^ #% mOȳG<:|K4Ĥꡝ(G?a]p;[UnUr<9!hkN(YQ'MK';AXkCϸY&:LYp&r" UY7*4voL=ԙ$Tya)OPeB! *pCS xeGT^|SO/܄X{K e/,05JM`NKΤVpfs鸓r ˱&al,rIC9qD l[L:Ռs}0o|8spe|TR%xIxJ.n f5UkWè>trDu/ džڰ' *Ulw֦ف|jrnj@ƧQXyp!ɛۯù^ *~^>ÝJV ʩ︋0qLlsT.kToÕ U5еkrG$s[}ܹVAI0=>~<=Ost쨺v.\k;AyTIv,V U|Y2il4 D0]7>7O_ҀQ'7OU8NrKשB jsUNVV_rkxp=GE/PiȐ.NbZ4:إhmJ}4S ~R9 QLNT04Gt.,s%HW0Y-Zl %!7;s*~sWj4c&esi"^.G4:fylR eY~ D5DWt[ʭ3z?{26M\ksc-<ؑ|0#2: MH xAr!j+5hC6c,q.Dq(#0n8Xh!aH,Q bYjDMPɹ4QRR̸\ףݔB^ݓ ?Pd89OBj +Y] q٭oTN$藋@0JJv: +W\{pa. xx=@4ϑj'xj&EUЃwIhKLю!0*EMtvԧi2*Ճ@apDs3Cɩ%HW@WghM1@q.Hz S [99HlY{LJ898L9┶^yfG>_m@󨋏m?<]EOoӐԴ`tq{yu <,FCu4J H`U/.\uՇw=$ K[ #Jۻб "*OsD0Pa i(vSV^;q':KQ /R5]FT;^) ەUp|e9]kT,_ݾort+po@(O,"-LnxPnE)J&6ac F_劉:Mi܁Ϋ5{\l 4|l! Ún+PN##Fh)pN8<`\:i3 G\!RƬbn >C$#h'"020C0366q*$#ąC6qj{cr̈́ɭњSIWxoX -0%5|Ӆ#8 oQ.20ɵ$L/)R3l,xa4`Αe80 QH4ZϞlnMs}I (jX[s(NA3-P3&S3)ÓLF06;# C\Bv7M !x͔LՂ7Y꾟nyL<?r)3tlgX[2LvDsT;gXzUHü0:rRc*1Å31Y)|ȣLEB|kHSj 6rg rۯhF}\fڨ8>LTڡRsQDDTw>gʦuQ856,|b~on!iw~j֘ ~n燐1o7/;KT?)RX*~^uW0%X,`.>Ydo翼"鏇Xi<'~YQeyx! 
y&zMɥ zT[d5TH%dX*#9k_}..ً^C[]:wV Bfk'd }lx<|{|/lJI39߅ffƒ*d)nnڋAѾ=ae2}C%I 8Pm7qGDSWK,:VU-9XYV3;u63ɿy |:%/UY9z}樏gЉjDxZTqU?RXSSYmӝt~f+l_2ۇY1_Ếhzy݉6(M*~QrqS`C)WLlƤYMHWz1*#iQS5ln0Mx=KfTw P;ŏa[PJՔT 4UX#*TZ`6%^4*pշ'È NxOH BѹdCGPBPZM- SȘ^tÈرa%#Xzt"+$eSGo\&PIU1r~sL}#(>[>]/kDϊ-QI؞jY)OPjPFaä!_]!v*ƌcA1}@3(D^ =QpP$dGlR÷ͿP |ԊL'g9RtZ94O>ç M4˦D4| -}Fvth-;ѻa!߹v)5=ӫdZj$Vi-kw5,F$KbDkQI3?2b,eTRBJȝOqq!ʫdi#A<cVTh#%Mθ8+l>2i1۳y>ǖ:[^W-S Ft A"`!1(![$%"#H`%Hrl&ᝍ6Z=TV]ʔNk5 ׻/$Ie s%ΊRӂXe1+'Az \N+4rUx硅+s+Nl& gSh!3WJnH[j @3`@3g h V`m AFOaz^%[)9}ĨS HR: kAA$A_S5]5E#~3EJM'4 Mwƅ`rzi8:_,Mϱ$Ӡ&Ԉ,#Ҽ͞?Z4"Ƿ:UJI 5!bǴ*0d*)vP!}C<]*%04 oq\ -IJqVNalqDGp]BS"f )8_(EǙ+ ‚T,uNr{zwj HlRQ5Ɩw:98@? LJE"e[ϸ}k3Zg\a|\m`i}p YBYwǛoVŽ%"^}qϘ 㥐.)?ǺjKwT8Օ꾡Le? -%ᒁ0Si qdn%HL`8Ta!3 _!Q%Cu)who$sJa&(gYdC$% @`+M!{3Q+4F!paU ɰ%i#"BI 4.Cp<.u9q5W!lA[7pzR=IU V1XsSN18^ȕظa5`{tyQe,;~<>w&I Pq ˗Dfq~E;3v-OiZ7}Zz~uߗ7w0\,p`vW4yaw%)θ݁@w!tspkWѪy͞[k7_>)O-ex}ƺaaW1Avng+1'S&)lj|D͌"(w_ \q7뭅M"NO<( {6:rv2O&h W ^ND>?yxsu fv R؜;-C&vF4=AJ€yk#TTZHWJtY3i_`Г3Ӱ[OyP=q+X5}dX(OP (Oʺը=2O!vOw+֖s1^knseikvkE61"n3F)L U-_M|v&3FY U2 iņZD1 6 AmE # )~,E~|9L,Lq_l7p_ pd>ɁSAjg+; c62"jZ4 H|P;E-IoAUP+'p߼M KP:t[e T^!(=W mNCCp'RR;F%C,6snGԭ<*`gV?c`3S6{q! 7{[9Z|)ojA9eKZZ=*0Ke(]FEq4o̯r 38߿bzlр, 8( Bx))tJB푋^[b='UW q\Y$ҷeМ={oQ22Smc-A;*#Cc>vѱ졓38cU˃J*{(v BYu,+aLE b&VS}Ű"DeX NI &aa9Z#D=/*B@8X ?[Ʊjl~-$0F1y+0*clukPT:W GKSHI}"9x燐`Lc P$聸"1XA0bWkɸ&(_7P[wSLƕJ ?5sU3* keet! 
E1tekmӲ%ꌖ٠JePRydf`qe} .\g>]|vA+hl0h۷KBE$=V(nwV-}C9.l.4ޚOg97!|w3 99V;&A':$I"Ϗn9HNrNsϷ0S<4Q Qڒm`tqˣm8kQ;m)&mܨ+FˣO-nc >&]bFDJ~@04*UCJZhR13jRz|hޕkmO1eEԎ%Bf0ʆà @k`PD4rEH$eEpoODjJBWs_ c|pk@w/ל@i/E[MXl28--N;0`0F%:c#vĢ`mtq"=*nTsr1y:5b!m U%TXsK-b*6b).$IM,ͱRgz_)>͏˯a 3 d(HU\cIߗlږmJ"%I.sK^D1HW:EӁMO)lZR9sh"%`]`p"ioʀqUzû免ӆ3=*TONcT2J8Rk%sQ]௿%\ך=peg77ҁS])&-8[7GkM9-O[׭q–*yjB]NKeJq,JU0Q`+ a Q!A#I ]*NrFnVKwxzk;dz:h.,V:5k61mꟺy^nk~Kޓ}ԎmWtOf|z^~|Wݼ{L)|?pSN~}b}aC ]lVeŬcm»~ռjUr`y^8G ;EͶ* M3;iBDe dG>PB䁚A_,-/̼냚].p`㉴#2i=K9$P'axp(ˇg仵o@g"m@ لK.@f;gR7}:_ͭԺ"cT:s t3%+T;jd{}߾ | vUHYPP N<'D3F 2B#uk_5#P=|y |d''|ze;;Jwƈ44!MlG8 \5FhW+eY0.1uJ2&kIx)ТQ$h2e9a'y!SUgOO['&;?|`\&~tR` Dݱ914px`. {j%o$+k5׮-_i`X@Ʉ0ӯ@jJ:wj.p·lbMo* f73sUw%<or&{Z)}1@ ߱1,H/ W6?Kr ή U~} J}n:5G 8i;*z#T^N/{Y ܫ cF[1ރWF_>b1Jxs~ իG 0MLlX΁[ξ<~[Qk7["@> q%q/'W/'>vS60?ӈB&='$iuQvꝺG:Fڈ6ohl:Í];'?{nM{|mTg)VF&I0{:zYqڇ{g֊N^C>;>_{)%1Y'<_?;fD:yވQ dl{h}}a\le0՞4/I8OZm3p.Ԟج!T䧲xk[)lu܇[޵K=Tk)j',-.*Is+Bf^6> -*z ZRp(-K SpKڰ>4kmkz2&_m_n]~+sF0G@~N$A; z:sJ' W\F)GpU%Ps""/0p .$U@sJ9E \7jG}V6 jMQpWE HhYKW`E-(9'dFE);T?|QO-Kgu7ӾamĠo=dhC~Sœ{W%oö[N|ӏbz`w 0Y>?e۷uB4PC 45zݞkq#'÷ Z?Be1*:7cpV$xi{=^L|Ib%JVG2Nc]o- QbTqݍ!yfb/4J1kµ]m63XFh\[Xrt%KpmÍ峍h5mWlf}tH׽gݕ\sguCxҧu~>лIj![G No4nγ { nzGvNcdHdB%-#(-wڅn2X~ۃySޅ*j?lvn{U+oo:-gE=a%~`V1W|O?Z_/ӻF$,,m։ >1lSkl ePʖUG0*椯wbO_[M^kI71X!L1|#wRuT2{^ts='M%!/=_\Ww / [fV#x Pݛ6J5fM RH^3gu %۪9l[Cz>6Ŕ|DsEQSTX} l+|u絰_A!~D.6:edu9PWpCyszI4pky1){Sz٨8N3Z6X6UW:_~;s2*sq[8 A g/Y)T4+)KNэ~DioS(n,F~@DQ2 2!Y" D0@s DSF bT3"덧B%#!˖1NM ?bk^0Ͻ8+jQu=~8fv~1&39}-+%X^,oejVd#8i1if>9m^fN5Ǽ焃3 9SS&q&@+唳BbJ\: $_+V89:ua˘#^jp@Il wIMlƁLRJ)\KҜSӕ &bޱKKA(EjrO^Qp{ZV B▓pmbap': EZ0b/8M^ZIC 1aB6jCW C:쪕DM_VQh&F9o)|IAj :[Tw:F_A蟭+5gk ճ mFlǑcVg/d;AGXL*̄ls,zƌ7X5XOqh18K#8+cpI5vm<+`wU쵋Lɮ&W*m ٫ nqa=]G1Cz:֩viJDjmٺf U2?~z\H-F<|y'9r2}D1}݃d=$PcQ/qOK <l@2~Q=?p=f`[ BTdr?\hhĊ Ha =.i6ƾ6H ݁!FUO>ԩD!.dqLJ$o$rlNxe9Wx0Ӌ_)~gXNDM\DUyyqSJrB.:c(v=פsYIpǑ%ħmo}f?7[:2wWQj~ZT ̓}q6ͦWќ~NwD3߁Tpy^[Co?D Lgd1 ̣vU dx_(}O/knz|`F]yy!]Wi^1)B7 K\Jtv؟dv9L ּ|wEZND4`X+%(_gڝ ,j 
?m17g.I]nfwd~TL?~l"<fX[p$ºPha+kyH5N5~Tpmghnn&ΉMo{wmJߋVW)ފDN,7`@+D)cL]ۊ ==1n5^i#C9sT)a RaQ5G$gРe>gv@TX5Bdp_lY`1mfX%pʁ@uKɺ.f9<`a6`"dЬ@W9bimNgwr NE7_cegj3=G, :F0N 9Gfx{Mzgz"9 yB,"v8P>[E +;m.L+:XuӷF~0Zzxsy+7 pcuξ].8 YEI;YC}C.|(_=n·gB6qe(%Lb؆)[ σ5iGL^xOuFC9Tm "y\Hf.hC\t+w_C,ty_y 5]5`ɝ4)KĴ1W 2{knxї1 d^!|_-@|Mx1#o %♱ wpJ62.zg7w×6PwoܹYly8G1V Ѧ9ZImeJy2.=P/(:ȼaGf(cmǘMm[g&"K hT)p~}:z~ts{},˝6o!eWQ8&`U㨖z($:?,o ,=W &޸٬3 Ւo{y~o}+ e ]{<*WbU*IO9+-'a#DzmCP7VB\6(4PZ.zrՒ"&60?'?S]BqE3S6:L_йȠ07w2, S'ʢ@cUE)%*!d*S 8:Δy#upy "@`Ydc6v@4|))L * ]D>R-9fnm@d٤Ə%uj͏Ql!%Iz8!m5Q"'6ng.4A9bgKSvY{sCE0r΢2p FӿLME&Bz̑?-r{& y"ZI ϒj9jwAU GtQF<-KkZUhvCB޸.S r?}맿 Sq,m;k*abzo)zPFtI|׼J^Rh i55{af:ݝ1: 9G7}{JzbywTj.e]36VF|g/ZdFY̌(|\8Ԫ~o%W+6vUWlFlL PF/;u,7 OgU$K(թ!\@\FQȥ2rHiTWv%PeWZtN⊕y):̂ViW3!"J,D,v6ׁ֤b U,|e 'Z٪UHt:YU*>j\oZz/pq04mdXHuTkl4\lqpewp)Vp]ղtЇ+WL,6tCuS=3 y"ZIB]U1cn3eO mMUhvCB޸V);oqk7ҍN1h#znuHуeJ)xXzq59Izfxt;9q,%\g#=_~;s2'd݁].Ǯ2'j~sz6y9yuAX$@0鹈L {d48E076Ő JK<+FG׾L&W\d.v8 F^"=Bڗ{ng('휁?驃+:'J *O &.GGD[%0MQ _];ǓRt>?wd&;K~ }"pmH"{3.2VpO>ḩSN,<7 hM>, %V,nngse0õGIcY RuyI.o*>\k`L /(-uB"E1j16@0aN{G$pj 'EH:u^јu3`MXcc= !E)c`T-,΂抳*8^/G?>aj娆é ;+QK$5hBeSj0! ʧ!i&#p}T*Eꉁsv@kۍFRj? <=;Ƭ_v UjI.n fugV^*öک`F0"q}6vĵrdMi6F:BOӪX=[)'}:O7*Ʋ3ZN']~KgVpp <׹ ټë$JӇgwXY;ЄS5{[V:zmD-;:e;`Naiݫ Pg{. 
}9rV?z$UH@Vun*HSENrM15FFNJ\l߰/uo /J*/=QB1[8X{S!Py4qCdsYV<{ꁢLi:ز8npЭm O4_t k_7"Lɏ!BNDq< -d.QB6ҝKHu2*0!,yb Yj5n~YH^v>/aP#Ru e8 .Mo5ЇyNAKN;䠘]9s-*TLYJᬤDXMתG<JeAG ߠ l=#_ K hK#522'NJ4uƐJz.DIY` Zh/J藵>wJr7ma\BA'8&D %6%L)x7-|Fw]/U瓐2'T2S [)mt9{@}6ߺvWvxZ#pl[;lӃ[Nض"qm/ +2 m@ߌ 3Nh)T+lZ`D$ T%)2|߫Nlab޴f^oy3mTwINk&KܿT57/h=6R631Ax'i Ђ0%-\B@1 peiP@RkPjA<*ӵts*M7S*unyfYz:fЭ5i&S_f4a,-*jp+K:xVmm!CwIvfj\k!:efZ -OߝyVh}c qs,BcZA]QȻ W:VW޺/x]snognn1_¹1l#\^j";)w[ɽt& Rަ*tۘ|ar$!a,Myu:!~bJ֙t Y0m]K`8Q)Vơ8ST]ZdgాRcN_da|C_O̪߽]{+̀g7)KSB+TELp`")ZBsI;Q!)wjk^A@urSH"|d3{"T0},mߟ- %muV)T +[KPQ2h)KF98BbT2#8ʴf]a5 xM8jUk4)t$V*I1&ZBZPRbPR B_g/i+) Kp&H`=؛a^nt}xQ:#t7]a7aD)gILd/pǗwT*vYYC'rO-+ZFF* "2٩eSS䎗j$.x SZW 1ʀSK$E'N[It Z #5+* JzԚ5owi|j`2,8RO j'g婭vS~Q~jr^B:]۶.&=&Tv6Xa K)+~|y&'K7Y*ߠ>٫Vݖ_md /YzWTWKjTTݓ3AKcc!u#)4ο"P 80vQ sDwC@ MO) ϟx?ܬFZ IΩ }{-P GT֗۸"CjT'LsҨJ^*SB7LPFR mTYmPDCB<DP(x72AU,MD*-LIzѪOx\tU+\$80n-! 8HZ\K չi֍LqTRNB4LSS(hIxL".H{pL6oPɝdsYNp#kfsT | F:bv 0J0a`QtM9T[^E~6z5?igF x|1P2ǧ{J@$^ÿHD=Qf=8]PS6ENݱ}WOn1:bmTV~{A=;C{HUzƄs˗'GGoNv;hH،Ҵr mI8 Bm(o9yg-93k#y)3c&n$!Rvy8|J@c/5$YEYC3O c 7,i5~W_'oKV4_Er@P9 Lꔷ,- śp:au2%ļL90"Gv/>PﲼFH 3eY5-#GJLRN6`C7͘YXτ!2NLEoጦ'):p%s|%m[6ATk`?簼'+Y}q/#aej Nh'7Ι$p&1)_[§嗢-QRh}B\ TUu6u+li۫Rͥj)H8c R̯|#=x+p͒;+r;{, Ynv Ņņ[Tl.* ]y.~/6{͸vYaµ _e`F:4rᜳG*f.qCPm>hN 58,(qfL:fB~yxP⦢Qn6!V <Ʉ2PzTn 9@QɄRIn| @*l2*e [$5ȄgJ]/DQN O\E箿˫jcf ^jCe`M*zNHPDm0[,9@HRtJن ݱ>gvN2V:%Sx%R* RqL҅0N3Ax]ji)9lHeQsshgWCGO^&;>Z/`86ӕ͆zx+/|$z8 o%FGlx TsiKk۰ mlxMJ' \lxMpR]qw5Tu%#gkMKܴ㿭2oUS^IZ`FЎ~˚+bQ~us6E^;[}'>\hԏמF6ZG]kZ=<[HUgk&fD4cdTq05 l$ zMhݑM鵐Zwet'A3EYɂke '-j(p^S]j9j@NBA{*kB6k+Wk5^,4wl^2tJANC14^⚾0d<  x)xb$v %zFJ뉀T(=#q}DIY.kwAY%6ΖEiLI>}4?./CD,~D&4>'bv[8"lВi%`m0"c2F4U/!&x~jWϫɇ:Um O~j)'#h$T_ÉͲdyj<[^Ofe돟dSN>p;'p'w >jˤ. 
d1._/_3Ck%B3 P6vxw{/D`,yȻGG3{5#fCdπy"LV& RP"G#v6wmE+]`7~MN62Q;~vgoR܁~zr-&cՒ5Џcm-{uZ&`ɺS'YZ^}:y9@4M8@6[&wk)(q-eo!qGNWkZ U#pupf']\uUHZ0tz}!.=~/a (+F62xj0oK[>'X| U<\eD%FM^zm]-l><ֿ׽z'k}*ׇhYD{ٔ8wJ1x2(1h&봱xώZF]קg7vq6?wg>[ EgfL0ӄ hfQOnVNuV!p/^{u]FN7iYxFksdS|G)Q Jc}vlzQJ 2$ /\8؄0]NRH!-r%1LL׹ڂap'= AWxrJz}k6*H0>:fGYwם;Fjѭ_=(eJ;# yj Phh/fCׂ|`-x3cthD&r$uxBϘԊgN |NĦ*i`H`9@4U ϴu%i22D&[@1Re^mĔParoaߣa$c$a@$HeQV: ڄa4N$JT" UL>\d[sQJ59gh;z%Ŷ"Ds0eN޸6xdU)Õ6H"CHq*DUx$EU(0gQ\(iKn x-4< QdB#0 /żg\zE1(rh4CY-6J#hm8/FAĉ3 3eh?T2O7`̄4BHMOZ&!9Ui qʲ(\-18mUZw\# (E\4R3!7&ƥH(8q9=L5T >ד^ak(ޱ~(.faKF/o'N!hv;F4=ٽq 6tGl˽}C7r !(rW;{=:1sBڹrcZj Aﯻg)[?=w1_;2LOͿxiqv3Z5'Ask-U 6G"MNep)v W F@} % Wr8W#k4zv-MOu閌K7z8gDB^{ô.6 vH5[BDGRs_)BAVTKQ' B: %CkmD kx 榌jnfcV>>c\v".v謑𼠦UEˆO\^^IӜJ{u󛘏gNwL]zO.WW׫קW?VW(S3+>[Jnpn'+uzW/0D/yvee1m\u/u75{wOn;p;Ej6n | r, t"&T*̰\>ՏOv|K?zZ>fNPMSb.E~nF.tuc;T>*/)l-툖8a8?R/ ?<.*#'SK2e%&RIOrcgi?ЌKQcLo}hDJZBp: =KԐ.ac$1TѫJ>P_۪$Sp9ce1\pna :؄g mWt0 IB)+gc3L 1Kc!# hɼ4|`(>P ?~ $n%0J0 aާ'Ш+ǗNH4%# n|U$Uyneז؈/a l`5))$0^UJ\fRvJFy8˼K+*fHZ9gHE! 
0&#y̼[ﱤ*FTFTQ.irmpx%# <yA0]Y3VtWHCH#nדyS|+FlؖBֺqFe%9L(q8/18 1P)hOg*<@6i׃j H2ERᨶX,(O t gVT% D-(HM13JFH` ɳF(pQ^b^K'L$A$`A{PmPأ`fPR#O(x5ŌLwo%B\yߝjBwڢc U6a"DGOcH|H*17o#D.p 6=f#< 9:D,*D0Hu  I.`0NVq4NcCN5NSEt< Cb X$oS2 O8c$|#AZ1Ocv\3,^9q.2nऔhOL1TL2ҝjQwkN PȮwq= PCڍD?q==\6ڝ#ѱUMU i _ku>hNahBD'0PF|g heB/&= 9)řAHEQ.cz `b]QcRXYt8+!gN+aDQha\!; Ůj.>م;m-` P]QٳmE8^3؃bpJ BӼ7#X&Mbs ,R |C?~¨cP̤RR, )3YeFi Yp?I~PcLƤYRfp0kŨssGoaKTYUĐLk'1=pN{[1󂁙i1OuB#\*SQUz`I3t 0VPYMp% Rk%# di%d S!h,tvk㯽zdhT(a57r՝tl˴)Ӻx}3IIB0$HA$zŨQ\HP/\TZje!o*{ÿ' PnǓ_v8'c# +8IBu.0Rٿ^.>z$eͪ'KtOίK^ߟVnwZk.?f_=K\҉Lwv }ٽd_6lO%v`| @frXɵ%uNwɵAQ:h(Z$_pHE;L|=}r/YDGňrBQ), ¸o>S}_.# %/v؂Y3H>>EmA+έZ)IT>"*t Ur؊QVWoزs%8p uyJ%tLP($<22x#A+ډ@s2HX>\-W*UW˯u3럜_ϕX_oVK$y(-ۭBV p`tt̗4džJPIVSy=cMlǢǷۊ\a/\|=$e*qccrCJU C~q8 CyROx/xvv+°;ʒKGd ?NHjj꓿4r)4i&)G%~67OLͫ֓aj"> S3."fbJa۠iWU*FQW)K;tz(Iw FAWӈ~3[HfcWy ˏ [l ,q|LD)•.(Q|;nžsۇtCU# H('MlfuPrVC v V$men~?o&f[zO?FtJGcu ^ k4vTش,PITWRϡ,T:f^dNdˌjJ(1${=_UJk"y bv:C_yJϸ8O_s7Io;&< a0{tczmeI fIVY]~R3.F`|g'?]-wnf-E\Iw1΂GԒJ6CچW{cnZaoEMߜizuOԾޏwmQw| /A`ۦ2bK*;T`f^ꮵi,CTm辟@{_ኍ|Vu-yy[Q3h0sZDq/dwz?ύE诹\K%j^]Ybڎl}-7~ LR*jB6JaH|0䧛.#kC2OX2keM B|$mҠ MVe6ddy_dem_](jA%c ,MRA@i%T1XsX!^Or8"U7 >G=r;Y?Z6 y(]/cތzQ?܎F.H#߿^-[Wv_Cj἞t(G?:*W7Jo~wz$-97ч7ڙHzut/rsgJRн::_OOyCUJe3zG}>46V1أFK3”^]\R5k:THAV|:u={WLjvF\mZAǩ~Ju84px >pĞUod$ZFc' ( 8`@j#TFb h&L 13R N5IOx#X2jWFf.'λ&P]17EdUʁ]=y=KQt EB*Jq~YsF>iN0eN:[fKY5 ? 
RD912k6sfC5g9D7 3gˎL³1 Еas< dƌ+QT6y2c?:8YwEROEp[UR晠8 ͩp6b =O̫-&٠Hʕ@HVDjI9W=j&P턑6X4[7(AieDNnj@gּpU?Rm65>o<g**dXWEfJ-P hJCj ]9%LyA@y [[.?^|@Œh4s笡n`..tc9TLD.'nasVʹݜ)N֨BΡdAbQ]=|kWq]JE%vt/1BbeVCKh+mMhÆVoWFj%[~-ҝ;XT]Jぃ4+mo-"ohN{RVs TBrS'Zע)6`4' KZAzk Xח=⒍b mQ IYbr'v ۮPq=EV`mأ`/E}jrG14-EiJ nnZnE[1 8Z h63 65NO핞63]nljvCr`Ss18#%}!zoSJr6NF={y{"MnsE-ۙPh.8IE#ԃ&~4'_u>{,XG["?$AÔ GK-=a3DN߼)tsΜFk#O:˝7Tpb{h(Nw@%i7Qh'){Vu{4nMcj]ifݮ\y(ũNDLSi00G(Lðf$ū-cjp;3 sgEma|Sh[!PGcm))ȵe/ XpdP-f%j9' W+BNxo)R QАJPXX>e=Rp.pqޣںC|4սiņSHl( H: ;,+) Vl4I0jذ wMb 1*^UBVu)eWdٜj `TBI+ H*KUt̒Èͅ@@a'#berJYQ"\ "h_d)<\T[vw"kdUl%d'f u6"*EYU]#ծrjPMBhMz4GS 5FX~$^2U W,BIS<0b~L^2RHN#O/U@Mͦ(xTA VIb؞i|}uqvoAFEsr+MAGXD0z/2Amj==/sϟ VrjLlC}2ND<_};c*tϦm%[5blguw2OݳS]u_˧T?3#T4+Q{RdFDOհʫsoAE"vy^mn,nZuĶ_m3aAz86ٓOWXrգt(6/{ҟ}XDz-f BB ?ӹ(kKJM1΢DmI88 YJ:F&=4`4o }2WI"΅41ARu*pEɆ(I[|)[$½ X]@ 1+eP!Ut!Ih",Xw9ƨ> ^4-ns9ɌwjD|Nmv3ҍ2 F;uRK2ȅ-i$;Xxh<Jf'T-UhCNZ.~ 4A!FDQX~׽BÊ}S)+㙅fkbTEit<rM(ĸ*tt\5.)a Bb :,c߼x&]k7fYK#鍈1w_; i,5˜oRͽr;'=`?w_g:i[=Esy-{)Z;xӔ=E4n ~MHuu]^X?{@OO\1|jسrkqHdP}ո&Xכ d'iV@j6;MX&~0msaHو}Yis_ .'C946TOZװe L>ޜj"6K;hs5XТ^#5ˍЫ) .s'"0?k[vّ?{ۜx\EMZ8uzsgW1lRxv}6ǧsN^^= >L}(:x:˗G]^M|0},ar;t/}^_Hbp9Aݻ|5Pw& MtMY!GUfIѻMAL}|Pn%RB}FϻY3m ZLg;x[NםݦpinsX7n+6X/B(#P;Sb9*#HDtۧmDk;m*ŘONdoޒ{}?kY[@.dB&tۡE|]*IZ2š>`G=|s0O W+Ŋʮ`.B'M32:M#5h[pNfodM4Ft[pbໃSk"35)TUmκ-ěY Vܦ9j[I@DŇOq[7"6i90wQC̥ >JEh30º 3y9*Ợ (M]m;})q[Fy'an]T`;Վ3zY.TS8-#¼s3W^K)2l)cQ0Sq[FX<6n`A[|=$˾GSxG>FAEJ_ZVex VSUӟ~4VV$p;+Vbm#u&UCHֳ;>wNЉ`L2'?ZFx0DR1/{!9ǩ~{!D׆T&ݸ쳒B࿍`o: wmqH~Y`[,V`L&x;0H/OlZyn}Z #"YW*!F <揜8wP+5Jk ZY!=Xv>EcB2I%<B0M x&nj =6KM A2h+jMBoQAt(K-Q!6IA;)2uM4>+$Rrnʄ0 FI=ٗ>RbVᨡ*i BJQ9k!9fk~;ٞۺ<€AR $-Trm݈Ukd Z;t&O S|bpZŌ3g+M2 1}lap.[1FR(VS#? vꧠڪ;2Nh- '&G{wf)aFWCw7˰t+NΰXƂiR+=쑱g֥dƱd/vw,dy>8xhHv*sZ@0omعvٲt']RO?b*׻Lա Q;͎d_xX-:O% >5_^} n4*| \~pﶉI7qUɽ3Fz7 ADwa[D Wo~{ٻ$Ra$K. 
P:J%  C@1+k3_uT^l5 N@CQ.~U;.’iwoLs|:KObCMǛnWRvkxEV$nYl(>v׷Κswڍu{$r3rؖ` ͻiσĖRNj>i:!v lBRCdW}dq=p%Vw)>߸O7t1R2Q ϸQz.)7MWgVyy=LyQ+eWV.ȸJSj.;LoyW8%*rA;kx>kEw!)E!Ia"ګ9S@6`;- deswԆ؁2n~4'$wqiҌ) ius$ISmZ#y*Ҏw5eo=7S~IWaI " E'&I8ƂǨ!H-hf1\NkU˳-  h#DR[[I țel&[|Z~ûB?ZJ?6 |=G@{ߜl3>*~N,Z;71&֨ [$E߾o׸QS^oxf+_CKpْPWݡWh7_5gW70iS|﴾K pk¢_=ў]poz춗&>~~moZR:E26݄?5zg6רTr;]RvI]F~+5)TG~ !dբi~?7@NZD;;~zf1&J+cQdr].2qiEMT+c_IZ ϔb(hTUKƗ 4MZ5ŕG+F YLFP_jnJj5)~vGȧ\=&ӟ>^~'yӅgP51G[^Y{QBOl9G95q%!] y`aiH==JlFǻ7+>!k\fWȓVMlLP+/8QOcN,{W ȗ#E`>X 8E{|˶&\me '\|(J.ONm^kc-&{`i{"Ӑ)OD1#&Ѭ0ȅ\*giiDd/=Jr >Hket(t֭c[*h (6?8},K]nŢdUU^*NPd1V9 !L|Cc-LIgɼFaEvwQ$:YrdvɍؒbZ+V RdnN5uVZdQ̐RJ|(_nJotCYwoJ`wB ЊE0$"xbH5C =&}$K$Km>)#ohiJW7KRa/l8QEk~>tۼBOu=M6׆*HX*m6xٕCtCLH,8ymK|;#ö!*`&w \ӷ,- R.VQuJb.P3:RS%%nj|ȍIMIa=?2Yi$kZc4K, ^~xܓJ*줼 /_fλw oş;ޞiLfeoK쫃lX5<OsZ<)}1_)|/s[g1 c+u!DLւ|ܝRk$-0M-1F*GN%4V؈ʹ&RFtJhĆmA.A(Fj֝t'12puzs๖e:OU($6xeR*\5CY J۳ƪ|>y )O^F^RM(<8g 5!/d llnϸ+Xyhz3 ޗ>Jy3 ܿ= >\+?(! r>w*xPxe |pDKC{{<2h;g1y-òl9p˰,zc$s -Cf"֍rb|K‰%4=}>,WK5yZgn I$xtzKɌcHj_x+UFmOJ|ݐc%ц܍i Hp$]E뛇?C§}+,k`se\oKh:,;9*q#w*B f;/^]<,tB]["X[Jva80J^`([ ƻC;+/[[bʭHCZec9 :R~m ޼vT I|8OUG9k~ `vٗhWs=!8Y1y4.ZH<Ly-=] {ZƧn e  R(d0M69x>V&NXr3qIDtMli4G٠eV6)?h3܂r >82#2 3lJ ުhSA8@ /R3R 15@VM-42lٰ|R+6#(QΨegԶnFi" V"w[ƨ͆GiӞU J-5Ihk65~#L+ϻz:J/vnĨ3AkYhq@RN&#`%FᔭCl/6"=%LO!<۲֒fֻSO-T4-Tp)E`+:DmP~tVGTq8v6DxA_w_>}-Z6ooTEfCsRE%,5,Smn){nr j^gnuIWoxG؞Nf%!F\}(e0x~|>Y]n{<;f= -M%d9bi`_=CJ\]K}kj̊jG_)7*-F̘=嬷#zvkd1]GcVV'oV$ږk l|6f$/tY{Լ|ԞcLH0iTfE|0* )Kvr@aRd'?96z,āˏcЈ;M])MHNpdԈs85 eZm;܉?ů^35Wom1}W*f_ }8ۆnGL.!Vcj ?.b%4Qi md,PQL5&7- Bظ  幓Vv .`v+, #9eKp'7?9Ԟ#S&Imx%@z ?{;-='AP5* JЪ@+E56 ]`2ΝO!yJSh%uQLQ7$-g\j@@4GwikaJAKjʵVqsENyFE SE@3\s15pMqCI$0}f Jnxxu@5 t/ KyT{_*љ=&S0R29H܎= ;/ P=ظ혓 4psبv5,0.1`w?;S {NMѩA*.t~~1xz7b-@!bDyaxZ)M&d%4mR%Kaaбh'$$uDAv.c J%30`n7<j JG=^VF0f'],=SH~_e!!1O#C(Cn77Oi{c=Vّ)ν3Q9C *[`.=:z Ys\'nEW08 呶&sg˙T*r=+!5|PѹLrׂF|șP1!vYػ[7Qwy*&[ů>PYzT]OIAŅhL=9K߸7foZr_q-OK|TQc]2ʮ2pH؋ߎo|Pzʸř{?XB*`s1 GUPߊȋq'>*;Տ%EqAy)yq ?ʴHX뫽_:"3};i#Re0/2V>Rq"Ru\jDxJxK)2)6^tF,$et+d"O85A0^NdŽIiƛs 
,>^;dCI+ŢuP8#X3QoN%jb0ҜXHY 7>wsN |LIECS8^˟JBgOg晃u78)09F Sjzc JHcRu BWBi fb[t>Xn~EQ5VۦԖQ)lc˂jdUH,uQW5G0bJ&ИR>J6Ghy2x2ۿxJ2>Bt۸rhԚ$׾}<;3ȈWS7vЀ㹢A 񺏧{_BduQoV}0[¼QM%J2l>^tDoC3qsMvayJS4izy? Hipx銟U3I=)GAՙO$D ?ˑ0.yR2ˑ"M4:K$,17\Ѽ0rD&]2Ȓc{Г[Ex#)ø,i7gOq Ҍ mtG &b b/aKX;[_{ހ{Ӌwczl~h*b?>tJ*':O;)=#i= KzjĜ=E$%4 V>dO%89)!%xrh=0Ǔ3Ѓ!,Ks&7n ˲ВRE]| &2HOߒtw nXNI2lr*kəƍڐ~Ҽ?Ҥ'X$wY/FzC2q*C?ܢ8w< wzrƔΉ(N_H˧S$Ӂ;NLr. ],M$yG]_Fm݊F>6G{'8mx7ZQ66;*3(8iy㝣C\"tAi+@rzOoTpDˍRUmT#!|,RFJkﲈŰE{QUrhs᧣;庆l?ݿ=){t}G\ 3lN?~f2#3r]? -7ż\Bh/ɡklӢEVC %yZc5_l};oͺV[7dןSM 76rnxn1dVH +XxQ fb؜1[}mklPR~׺Nڡo0uA3Y\ܖ>y;gοW)SW5j~%bB޺viVTۼ5@k焻@=D5Z^i}GnbfÿŬg\X&Dt'o̤l$fR(&朗3_BtI> hEp򣝯tL:~*~c#LD/T{|Pڍ6?J#A}lʤaZX7i"6N)!#6 Hoލۅib{䐖4|3q:Ec ̭ iK *a=ljS qx7q F8e<2S@o"Fڙ&0zɽ棯>-0SQ1 PZՕBլlXYAJ5"^K54JN`Si8w'u` &Ym&]JsB ^Ԡ7 d^͚X.ﮋ~^X;4˫]k )g c賌G[TGO#^.+*fT L ȫYmO~ 7|NѵJ!aP;f˙=O7w&lG{xq2u bBF@z׋{n/B6925gh k4FOȕϪ=T.3m62hCXp˘4Z5A)*XLPkY:[~w[ۯ,}ܱP_Q˷akLj䍪%-R~$r$R60DO  AB UA5V5X2Xirx ]Y2ZRӾvW-1hC5er|^b[{ݶ,[eg)S`V )]Yࢴ.R\`Yڔuٸ06 ZAla맶UnN6E-W 0L_NW-^S)f+[U~z}h3וrvV\G.^~Jpa_yk9<8nu~rVًOnd{5:[^pyxreY{ٟ_k 3^e[W).|`?u1wxߵDnһ_.clk:'6܂tȫ 9ZE=+/_xxrr&䶙o_7*:{O'̈e43&%nP&,Xp;FQ^Mo>2QbRx'hW']8A 'Wb*i?}|i`m_w\T/ vLmzh0,qmb ^M F}#l->ӊdAfY 99yV0n)C"|c̤rmԫԠJ d!fWs+-w= ݗ 9uCG]uSЍ6b,xz@Gz-VG4{oL:d`tT43ȅrxz*.C\5i?Μ!پ~hsJX{9t!O7ޭ!O?v1^=v1zط5ZAZC1 H@u= Rw''CR78z}}rbaInsH6n@viZ5sCr ɀ⼽kzz!=HB!3h5%0Ch9ϔ#tDE>G4 ҰBDƢv7=> %  U6p9]mF (&|W/4@϶Fl(Qjh8S)x+8e(Tv:쫶jG{ʫ?E#P%`}&%zRL>kΤ +p9\4 tۡWjo lꉩjN>)ka&4*:ZL:tx:7>-xϑ,m$͚OB=̹d"a5s`^^ ~co[źW cR"VWqiς`C::5qq΅jV7н^1jX&ƃ1oڠD0pE΂X-П.qhFw6/,ۇRPDCydIj*5K;pz)猖 trvaͭ1HT/@C nerbrwV0'էH$BU"Nq,$@Ⱂ[[t y2ADj%=u t;}uXsʑtDz.Vtwn_H*%ױ|d3'i,#0 A:6VfHC06paJw)%]N yÃ-XBYDJX}$w<J؆3^@w|=/|OirT1< 3OR*6ȥjPF0%#%=[# p";Cr߽*0g ̤!lqːIXJ8;5pi@S* ja9d =2G0W1/4#I( MPB"F)ubD`mML&nr nEێn3JD}BRI-S(2i> K##H, U25# iS`pDaÌ"vԸ*YFܗ3P#*830>!x!F@p.<,Kf9O!Eh.GQxrSXB*lcCޢA @+x`nnx sxzY+?Q+5u-p`XⓀt@qKY:ᛩ7WD^U70.zb7֪[kB jYr|IHgN*ɭ?Y7}tDPkWpp;8BN;rǬ:S`"xf7JȭHqJ'cwwRj-H _N+dt-R[eF(Rխ "`kvqg3w+rNS ^ЉLN?p[ )ګǓH !d<77L3x3;4Vk7 k, {6ܳdW9(7޲8䙳h OQ.{FH7s R-!*ҭCKpYk-zҭy,StA 
ubQEuy/fGҒi'MRzHr9lVWMa֬p{&p}홸/g`Q,F]& "Hw{(6e⮀ XvV=^. DB@kV$|$!V5Rta41f?'!֝KfEGN5~lBkq8$VtoYQ& %f:]>Z*x ^d $0Ur\ӧW$ZӞ CYF&|vv+=\2s=XuuMě[xWC go [߽ƥGqf?(CL|Lun>q%sisi]e#u8)/'`׊ܪI@!Qۚ"b6*2$0O[hކ$(okScTXUZ"uBe L(¨'KKSR#R)"бZ#@>i>1Ok/<ѫט.1 7odd /*hޟ0O\]=_SSwhgF |RɭA=| ٯҫ\( UiX㝽wDw[I.KwlZ_V˲< !7ԳSpiԴS$\W>ԅ| [>S^$׈ErWtrԺKhQI.%@kbf=\`pUj >\ H^BŎW:/PX{I&?[|ˣ))@eJ " OYQ}y]^r;;u0B1V FG|tbRw, څ8z9;6Y;f%\]_yΚOt9N&6sy/nuOams?CmbE`f]PKԴ%#2wn-mݢB.̱P\YveE1tƈ|f#-fWgf@ C wzn`B!$I*@2.U2,K.݊5A@J A[Ł+A{3B1>ø*[]'Uc+{;+|2iGswEV ~r{nymm(q05C^C͵ęhsD@$sb%h,+nޮ>̋^cSP|zzùA&^+_ᣂϱ{veJͅ>٬DHB$/3B NOh`&DslJ@o戨nFy7 A39@޾ ekXP̆xٯ~Y{ 6hOp~}}5HNjjλd2~* w>=1,D-W֊>31I00FWٖ𿬧S7_Xx A1`eeOn_TDZNd֮ ;Wrzj5u} \[&[_*ތdY}SupkK_h=QNlqmy[NZO7V @K޵57뿢Sd%~q!3JU*l&5V(yfwϟ?EK (^DٜJem"5ݸG6`F-? JtvWSuzd/E,UQlu$# 7a7N"$$V[. {6: -n\n*Ϙm9ߧݽ7\0]UT?4V$P[%؁kfp#^bCj!T6U?}aA%\CIQRkW*a%E9ȚҦMo+Q.F%A/&1+\q` rإNߧ1i@cFG6RElINWs0 '<+]<i )4)lL$z AR9ܦXR;x~4:](䯴?ݥ#7[nZK6AWUJ[ڳ OjC[#.Z GL 5yQ(+3#ܤ4E sWSQG4[i6(:~xFز; j\ݯ9ڳK׳eJAi#)t #l) h냍sYa~TS-pIlUꂋ"KG\".ꪝ(,gxr6<@r @ 6L39Œ ᑌb8tuEjx 8ab;q8Dsu>{HAc.LčXOx* 1*.T,Nr;z Uٷw`l r؜]~b#;o;~lL kU*k]n.lG`!uؕDNJZ,fGn=:lW,OwYDΫ[q9BbpuV2lcSk5 k'l5b)2^k[I)n5AJWw:#Xwt]~=(O~44J9}o/FKt|݉s&?w7;!pZėS^9(^̪gPOc2K6=80ɈjFx$0D#7 Rb/ _Wʚz yUv a#Dc3Z9ODD 0}}vz3'ܟ1b8.2Lj^a ;{%5tg^b7K 9 9ǁXeo Yn7<뷲KvI߻[e1{nҿ[ٕЭXL_|*)ub:&IF9lq:oÁM[|gVyϻ>M|ŝ<U8+zv[kf!{8; Ʒ7acT¯nCT~Kkx*"@'.KUVvԓ&h/aSf%sZ!'EI{og<)Or_V3)mVzV RWbvJI=)JM)^@+cdBFYr>t;}XIQXOu7b 3ތrݱY-whܕӔ:˽,Yp8~_r/ۡ>HX}. @#[뢼c.[,ϣ>ztCyD{?w5:oIb#c6"fX"JD "+G/޾Gu !-۴Iq.6X|Mm Z-rz79]N汔{k( ipK?dJc#yC#)`SY<!+V,q8f}R)8nL~. V);>1"rq$=\^LsC^L0GYPF.6m(G4t)!Ǡ1!C`[!..j?O a@ARKb bFPHCTCB 32Q jB,V:Hj/T, Tc@T""q`dR"-0jA%xԬ4gƉu;)~Ah9˩jlf)^cbP87tR\堶ø}SW+1>[HV4yw,r绨{ a~~jc5V{;{T0]kfXE[W2Ta rV*}eRXIh ᢽbb[t+D. 
uĺöi'S|:ӁHlYiCԂDW`B.Nωщr٦ǵm 5$DU_Y-aZXaRFu$i[MJT UF5XmCA]+3N& hK"pnvJ"ˢ8$Y C!_l.g~ʞ{WsD\ƖDx(`WR 1xGHu0P )%pkT>yUY k?t<3`:opx54nԃa1f68{Myd)X 㕅ۀ 6!wj5fg,sgU+ ˤlLj9$⬮M]Τ$t|;"VEl :EoD`bސ +kGI}W$l8(* ηFgNf% jGx "FS6-)L֌V?ڂN&Zr8t"tvAdk+knU6 ]S+t֠ꄎ,/|5nޅ~+:,NoF&sY͝IbE9Ɯ}sߛ `DYUJ ](fC*"1A[`12aaa7GH T@Nt }E+'R&HGFsYbaB+Ŗ3 UbbX q~AـQ>ع PQkZ-R0˜[ЀkXAed8[[E0_*uO/2Q8ê:!ZfOj^l멂~;ocBS(\>AϟIZxMk~~3D@g ~׻ &eǧlp;Υ 0$\oo!$@p0S2` $xCrzy̴wlH \S|#F\sZa/Ex)#)LG\=y|*<>%U+ݛa:YuL FK@"2:vc͠ CĢD.bq/QHi*j8(v\ !Bvp(1&Ͼ% n1nเ%H n'#d./18QΗ" '~5]F7iΪZ.tJ|LM/^r$rڽ9琯KԊܚI%vf%rjE9/\R`DJhc@f1G D3, t*:j٩Zb02К|zZ8ŹB[qOhI 9 nm3&O7WB?i\K S.Ta5ed:z:1 *h=p47Tt{ THf˩]s-3Ms} #/]_A  5i*jT[ҹ YMx{sW:%<ETQT6MbZ8IT)8°2wK+aK%̎q>tLzuS-FMeH1'T{,>:EwK `cA+A+10 ?gsg9=CY]IrwUj2UPs @Z9a P "EXF3J@)4D!K5UlXJy*,5Kr"ٞCx>Sw$Pb"jOZ3ibi4bG]4tӭBJo ( DPtc?ȯdZ\t3o) ~׏O3⺜>B]HUÙ7_?Tmcy_n" LS qtZ&nxH|raYݧ{IM 7Pk9(Wh5Y7J6Ygwsd]nUub:U(}9L)y[[MM!v5,z7wUATmw;cXּ[历z:,䍛6%Ez=Y-2oNfVyy]_4mGjyw34ʚK A WY^l%Dtx!'/.9:,R o+=,(5 VzV i'RHì4 *{+h+ +e@HVdRVXomY)A+QЉkwZq[) Ki!;a40.MJ/J5KsiD@ԓԮ>{l+vP;jbe DVFJ6Emf5Ԕ(>5GfM(\k3QjhmJ0\{VJ79i % HH%H$Z;e3{n'&' DVh 7R[RbQu8:ե5ʣ28U8r(i]7ZA|́>[;ݦHͮ*ǣ<,nnl ֧O?$g,4F}-z4V!CBq(ͭ]5_xo߿sAfۼAK!Z-%:.G:U,+v[}s%[p% :Y凰:dy%J1v ֦.?\d05)4!M_6EAE\=˷SVBFDPg$lKlu͂~L2'˷gfY׃zqe~a\y\LPB)r%^LCwŚ5lxrMJs*x{a(WZp֨BlM\|?AաsF19sl獐t?]~|9tE$ äY&Tp= $h(P/` KRtkv9+lN&|=_ÜvQ$vkxV?Kb6W/7.4W1]6o,q,-R1So9cmŸUDR+D-#G}_yŸ|6> ̇ A^_ x:hzLwv}')J T_0S|DpB WSyyl(\Ϟ'FK&pW?.³v?O@LxRP}wCH$%Ji.oxi%GޖqtI+9㴒#4-ǴZzHgRK|RBӴ7ZzHm)gi)eQK9-=$ԔKR&Ӵɑ@BKI}-5.ZzZ aߝsі ,zޖBbZh/4-ͥ󋖞$2 RҴ4#u=D[JH0z]׎M~Gbo˿x:yyW!\(hҁƦW>?A{@ 6z.J4}asăBN\]++T޴>5!%--|C\%ŭu L% 5B҈@G+zȂD9ЧĞf@Gc!l[k(^٠mxK֊i{UkŤkxPL^U cźݟnb >Mc+b$+@>}B諱tYrFѤDNJԻIZ'פ<=l>D^<NmvA}|Dl5$8X-$$^utlKI(m\Z6Cݷ6@MkUB_\EWK#hWpUZc SقT$J-bf*{zMʖKU</ '\8/g /8" 3ARIȰ $G̈M5;@>P^!:&ڸSg?|.JG$ޖTSkZ2pܲD *vVe&]N’4W&u>ph7 ma)j5Scdw0< _Sd'e>At0">fƪ?"Pkd]Wxl]3-Ϟ~P"8a$㼉 X ;$MLyըbY!E_OM΋U:zyTybJ0Yl-ڋJdI%F$$]lӅ@:G݌`RX g(:Sa륣'0Z"Lif_i"tPa5SJt| ZxUV{']i! M ȱ6]b}³Q3PnKž&b2YƉe{-tBa+L!@&3g??j9Hᯆn|fwʱ{c8y=6Dsc ? 
@=v!pT"9RPKxfeZ ck 05K:u+% I4/'AXBhF8CR'DKa=+Ɍ]k5^ &TaFK >U"kv^,PY$5rx(Hφ\&A(ETr FF~;Bg7A/?2ҿ;$޹x6[Y/Rf/Ջ(`h6=8lb~| `M `M1XzUcc< 0fS,\QL կW4}Ip^lv.|_9yݦIPhMu&W{U\s^!! r;Nh>'48& A3+`|&q,(&?0 P/p6\fA8LEpFc!5?GV1k,RDH[N F^dS_Q5&HfxwW 9^ O:>QHթl$)3J48LFry D`)8\*ɞH17Jo֋gyڼd2GaCW^EKAC$%l7 Ú0sPfpİbz.b9Ƀ8E]˽ds HO$$ˈ>HLmFX}^xMXm ;",#|Y% (@1/_Z X+( ۍ3aό&IDbUd)D8|do ˿*@GUkh#4j;ac3d\ 4 v6ʃqYXw׋6 gcvi Sb"l#@GQ"yEYX=U ǵDrJRRH]AP5Տ;>6IaUVUA.,.WMҳSz&B/)UEBv0UZVYyfp[2 @h2,DA}w5>]FYSj:ƒVցUA;[\&ee,S=3.5,P~o5A< 2b+"iU+/6IM;pzs[z|:a{XVJU*ۆӥ& X~K1]L/oK)d+3),\IVwٲRP=(ep $obt #Ck8"ILUbhy\_K"g;z 9mιnWIxIq1W'q>8VּNRg&H@ _:PҲ/ϝڢ//!J9 \vC=Q}lµ(Q$ jp}쨇O<[i !uIl\R":KC%o\^dUR˧Ww{LMx{%ю*pX+Ylu` :^gh7 :˥n<*ee=76C2ڝ'7 tI(BWF9bƠYvnq֐w8x7y?F?,N-6~?<FɪNU+x*Fc\'l)b@kt]OhDE huu6ڀ@8'3ɑRx"U%V,N*.2e`V_b)ق3$e}1wo2BC;ePJ$!%T"9$@ɍnjrI}UyYBeEŸ2c-B Yӳ9^zs kw9~=JH1M_ǏnqY)s Pn+a r3rKk';Ct搔'(Ivrӭ},E#ޱ<%'ǟhÊgYhِ^T2C%k=6R^NECA KCh8M 0(5rFp26*fj8%تjd;K`Ǥ dob/Uye+_PEytrgtR-!ݕ'#;3A4W0<a? b,$GH!H *ۨrz[i)  ㋆ND?Yesdy*avN?x?N?RT+GB 9#B~^ETjK NJ%'kϯJ TJ1BMFL `Jxm.n"͔lbxKFko,0b3,QeaPAANP,UԻP2 ZrL4 JC4wp(@KfR +U!9I.&ݿCR{P 0U1RۨS ?0m@`zY 0rM) nB1ףwKA餶Qǻλ%w4ֻ尐WnI6G m-U1F*iv--rmlST~T|BR1&<[{\9۲yqy:IE͠Z Ơ1o9W ZVBӘͿx1߅ǪVWR?yuߞpwlԖ~68m.r#1=w<8T|d/9Y;yd= NqdA(ouwu'C"%pIOֹ_FE"%L+%n^\{N|3F dp3C Ž  XÍ?F!ۈGYQ@n!O2λ2 H((0m#`7ʐf=]Sɷ|&nbMe|*Kž #뻇ku. 
i1~\OVrft%{OGl͕SPץO_>{ϡfuyS@RǞ,IT6|Aq@RkkdD2J2BQѕctb ,LT9{Q2 <J.TxoTQlAӇ3Cq=?b@a{{5B}PB>d;+uydJZIMڙTO#j[c˜VCs#U#tyr?Bo+eL;j`ĸ < Ĉ3;WUsU{Dtۅ?CȻh=;cELv*u>hca Ҵt՚p4p q1L%[f7"1`8{eAq+Ny'„S*7"h煛Q >oA'{JO~Ss}4<VoIi.ayϦ/㯣wڝc}"@}P_T^@OnGT!!$~a!a@j',Bb$Fs,Z“|.7ހ?ޜL2c&7'˜&ZK0X~̯z$7of(IZ!1NgmQdL1F{J1iPBj TVA|a -] ڽ L):D7$51_bOTHjqP8\Adsbņ"(S|aB G , jJSq nW `w)T#{舐lЅvzI˵,Vb+0papp JSG͐ -$ӎ!iiCj%I}"W0?XnpP8Iyďb!;]7a*ӁQ?W%TqI.m /Eȃ^>}b;FT …T3*8G8>1+_ޖk뛽{{b#F /|G_.^ԛ5pT9M=߹]*nnA-$l3#t}b Kp*9>xHFa$C5 0F*xVbM :EJ)!(lH˝Q#pLh]%eYnί~tQg,= = cF "36z =ҚS/HvZe|y`S=9M#KZ{WlX^'Aq܁cՊ}[){8>D{ ~Rz@* r 5xߣfG&ᘇ;} tسD/bd}};zOJd/:Lu iԪr(aҚBםj@`.@]b CRЯ` }زa *N'7Z0i5䕤Z-++!xr hhYͥfQz|]j!>6TP Q !`*c2O"0sX+7j)u8Soz7W/EzMtͦDM Ej`CjV!_h澇 &spɀ |o B*. 9 d]( nʤYݳY%v,bҰ$ńV2UEV`[u-e3l;`}^V+* #r e1ȬaWkV >qN#ecEO^zn|phb[Kl1I'BC(jS F3*-bМPa:cL1+7Mt'w`kmVEnHCdy8XdlSlg d/ْVKӓ`.]Xdů& *8Msp‘3ܧXߖcw?dM1}UAOz5$kUi"bD$$Y`5|kpp5SRD˜4˶L[@e Vg;t>r|Rodx˰mPSίtcڝ~D{TRLU<ʳzf*˂B3<~!5j#!h'+1W2Y賿{FԂ>j}3*$o8 ̀uq#k uR=ز-8˳g#o_-] @V>l͚,MVʠzWqbC[}Ykϵ@e ģJ-4@z묶Jl;^P ADdRὈo񜗠d `ݒ"קv"A^_?\>?[ !NNl݇DŽB˳pMRսP+ ƨ[38K'YS:uznF#!>81цws}Y޲Z6l=:3؞Ǖabg~=Ԯ.}+66=Kp =JԎ8M1HoMH6-H)/>h2z]X5+XN9P*K,\E  !Wy]*j:(]*MKn4Y#/Gw wt}pe' !1xOJ.+Vl``B]P h(:QIfG__F?rҡY[:}]Lt3WO_]0gWgϳc?,>k bB<66H+BaKnIRmh%( N3k;bmF9a*svfF4;TՄĜ6Fc\RtKřvZRNָ IK#Vq0EbMBouIbdabOJו,596t˜,%Q!l`RLHj^0p)?@(qUkGB,IJk31pi {UꨦiHWLWldU!]tE}Q2V>pry[G-!} %gfWUAf|•2$vUo`f0{oj]bG[E =&w.~"Y_k}"gR"hh=3BB9IkZ MQhn Z=)m@wN zglC8l.RZSvSfviZjQ|6V@X~єZJa:+.% }w\Xd4-Hi=Lo ܧԺs,E1jO=! 
hLZռcIVb5z:ȁNwn\D2_vݺez>C$?w8[W@;xCF ͻuOn} ֘drK.O7Gz;%/}2r5s&pgj>TٝxU+ʡN,CKonYy'EM|0 ; 0&[h6g׼_MFss|wՓч7o7qmb.$Շ7hnbd'kQ2xd1oeޭ`#Ja*~104UKꊜ~^üBB کrLwe;%C{/Q)_^MUKD?\HsTC_蹯*A۫`.vz{ߋ*[F"pn.ݻū%d!KL@ZPy ϙ-lZwLyH4~ R2 F1fMܲK:Bn,MfnqwK/bFɖZ a:_Kt4a1~3.yYuq e1IVvj?%߶LKu6or$#=iz 'D`__૟e{]7J&J]3uwܯOod um5*h$\m)ugeK}"c ԥ8mEOb=i@i^ΞN P N<샗hJGDpML5ZXNezYҋI=U9'ljuR=BWA u/=twHR-* D(ٺBK /f>_LA# cebǍ3t*v7znd-%.Q(@#A#k& wc93O8DD޳Yj=m *-hO˷aVKKߑ%%D9࿝jD״ca6δdpm )nhLaD+j0堩M!cҒ@X*8G %Ci.IS (DVu3|;Kqʿ;(bI9 Te\}AB` %*B:m(cN:L8$Dd~4Dϕhu>WO_W慠:bC-'&҄kހma)"z cDnK0pdGd&mZfQK61,vFN$䩻CbH]9 sK`qs'ȒgOL:Bo`(,_P/JUҺ89SH)2C).@J`VFz F:GئD0ئe%RO7]EZM~'-BFKj0644Z T+ϋza4Lhky4s;5SJpZT?Rﺇ jy3@)]HGc{3m%}ؾ vqЧQk tԧcacg#jl1tR8ZYΣszP*bN<>4m^?af ^},3mxKj#?/uDkOv=1v1JϹ7Kb(OH0X&ԹAT|gÜ"+' .PJkpsUq7\s :v+P+Ii[Pڸ JKNA*lx(s$Y9ąf0Jq̅P0`Bز"Y 1@reYh h!# )i2(Lj"9C*(T' :U?K+g gӗIQm#YO^>}ROcFћpPFHﱰ\TeQ%/V4/RdM2XR$iZRb=!=15ԇ7 ۝2EZ4|M?Bl~f0 ۇ+:oϟ7q7J,ٻ8rW6Z w>90V,Kh|=#GdG,pg˭bbU| y櫛|_7ʑ&a͛/Y m_T g'Ë.$UMBVxN 䆛LrL4 ff?+>4͕E~=H% TrSj+sPmAy$ a)m]Bտ:2zFb#9LUҎcW]Ik*[X-?#ڿ.Iz/-H"*щ"0Ę$IX7gXi2vXuIvȵ>꒞Qw/~J>>%kԯ2}`5.hV )*4C}A+pQ|aZ=q羀~HjՑU:lq9?&ĖZϯo/޽ܗ{PTN c:ˏyb_␈P)V4 $y'rjM۟X5j[Vڽnk}ٟFRs@k)naJ >}+|w?=8wOZvx< aJQLcnO|npm 0hV! Z!FWS7ϮIB,"\z6ـ*ME.y>煔*̪ ;pK;\VL`.Q e\ Mreyۏ\dK݅ӱg(*0H򽧒T ;~[B @p/ݥVȎ آ+aKm PvK?5Rt,.,Yv-p]nJ-弚SY56[ŅOuq.wxFg}s6ۇ† 4*slg!0{n@my_FCTkSd-͞$T~#Ļ{XK͡[|w!oQ|wS[|6.BIen=[yĔIƅzJS@Q)P)vP^j r.W)y:xϤz)UJpUa.N/%zx:k_W%Z˧n1*(?T/΁ aJhԿL94=1Y7!)i;5z˻)h)Hx#-r6yz{@h|ĹݤDAO|6.]q̻E׻7 L'(![7'?-җ|u:Ν'x]~vZ \΅=srKjM|(P=h('F)\H8Y{I-XB53@B+P[1JMD Wi r EeYR)^C2CNBb=[(< TEM*~6yiiuUlol~a/7UMw7%G^k?'k-=D(_FkP(?}+=Ynه~ݍWou9# aI^cT_<E1L}&1 4I'N>Mb"M2vP{<:G1IA0ϔ>G)}'ƨ8f 1W/Шcο7-åagÞ< U;$lf}1H .o]obYerao_((`t~%ϊMធqv\">j(4# yn ?[Či% r)rb\SAyr\4e^*S-D^J i+ElyfN0ҌIdgە~r: 8h~HPKl{ȿG|^oq_SSzK^/|eO404M-6klqIZk0%ޛR Oy)Jj`uܿf! 
i:47@FPBx)Ӄ1pw~rͦw=E57MkgzE& i4Lq\~+Պ]ၸe |8.Nۛ zV-F/\AvpAvP% z`Z"X7>Lo; Sln瑁:45sHȤm(%;C+ntFvZp>Yv%d`zՁS7BΦB F{7(Li>dۻ)xX@'!mڴq-ӻ7(L|Ĺ4G#u;AEh1}toc!S6j" 8Hs D9^x g2Bˆx Fa,Yz/ocGyNvdI K 6@}^]ꧺRvv֕O]|^3v?5YE:ٟ͋~+]rZ-rv{X?/T$8GU!(|M>yxYsND([,JH<"/㴳\1Li{p>i Vka~_+&:0im 0it1(؛t)Jx? ;Xm$XU6iHhMN M!YIfiQ0e%T2u+Yq ;#@J^{)ED[9'í/32# A.Õ!7*%<*Ea*(5lKoWS~.yi5νN~SvQY\OZUpQ3?mCL)IsuX0(:2ϰ4T$.\Z"䕖B9/IOЊ 3x SAqarL1+9`f2 3clR]+eqIZ@N)c9i^rU'BLs@FT2Cŷ+(^ID(2/S|Y d\պdMWF'F8"=-SUH3|KpLQ/M he:P0 ,) vwԆVvIo@Oai6?$˺'HxU5$ [ Ici@0}7Ϩ= g@ wC 63i؅G}B0閩)iw= ;R ?Qf0Nzfɐ%&fҜGVՆ1ŇYKi48S|w|zp+ ĩͿHڛxڛ0)dҌ\P+LLATG5`9B82hsNTD;*h;Mg=R.= tAԡnPM]jD :G]j%t۝Vڮ:E3B^n,@힦:/nF'Ձk7BΦj'=qyWmfA/D~#Ȼ}Y[t)w!oQڂms˻ʱ)xX@'!m]4O/9xz>!Sd4꾙1a'SY=>֏ښy=ͿXU.+O5#=in ;D1O>8g#oނ= i5Vgx! VkTDHum l/3䰏aKR)\i5 JZj!8Q ~|ksصWrSjPz(R0ޤJImlqPz(RΗ2@>/ڜPz(BY{+$POMmr::fTOa_MEF LjR/9go 5 Ci-dK^ՀsvMf1QU(52u:/=n C%P۝ִ.U_n 妤^;nvb5T5Jۺx;L8${m/3Ia$l2dgK`ؕpy x(5/D[޴,&QM·!,GA/Mť&>ocR(0*]Y9r+ ,@xD Z6 `}`CjV1nfV:lVagQvf` FW7p()T޲0I*q 5}%IH͠k) *zCƜ[̛+!ڧD!FѩJ۴`S$!2xV<- 3P  %SKR`1dB$TwQT{(U+Jv : 1?.aA˄K GE Bie6gaNV(1Z榱0|ghNFbc獔ix>J)|Jg]·g|>"+y#Nh}e46S]?t.+@?q6Pp~XHg #3zhDGJx +NF-V#Xg K2ٚ8{uiK{YORKsAA\o NmKz:L!,;7^6q5Ļ. [_ rL7Jۄ:m̻OnCXwnl8vnObc:}QNyHm:ֿ\%ӻ a!߹mSl:/|: (Aq)V0VR۔ K2Y'#4Gru`JYJjfXY[Ό"G9 +՘gsR`~wkZûŗ1inѸpM7ZOr̓Q␜.Јz2L<*prbr*V S!lDd7iPRd2JfЩF|tjוcGnc1¦SkH4ٟT%Q<4-V5vU..Y}Ye}Ep޼}5N@T /ݦI4(Y}3n|wki60H{bKG`N/,Z$Sa/ mK]ODyB!!OmQ}ƻ_XGsB|k$Ne\چSh]tDc貖 &K 愺g((e\<0$;[!lRJNPy%90Z6%+s bjMv@ҡDa95DPR` rh)95 ƺb5bM!~jhudw h9` "tU#sB_>آ.ΏFfHT{8?hKoJ$xwr;VjJNwhhxG%w< ]/eJ_jӯLRIfvՏ.gm4C+}=k?KB>M&$ݦ4t) :"ȴRX*bK8I;6Ãhtcw?Q2X~9O# hO#VwqﺽSS휇^FRB͑$[sG^?5?Hm'֙.eΊn/dW7mwpM!; ΗAxfޭ`h$џf}BP/`X†RGt}ZKܜ>3C䪂UUW,RuZ![cZE_9o+b3'MԦ @s'CR۔/խ֙$& tXwRS륃ՙ[ʹRT%)A(Owo.Շd})+ )#-^,nEϢ@{dǠ:HO<ع'/kڹwNгzC\mYOLXv,p+dlPu -H>gއ\';Sbǵ v:Am5jWp:Or~6KA{qʂo!;QE3eŒUivYiʵbqڸ,+yF]wZhV @ޒbR#]Vݔ9sDfom9d;XJC)ߧv$Ǧ6C&Zcp+ڹ6Q[TVV,XC:+ڑB~[c Yнb̯} $qn+A7k+֦r>`ыJyK=5UI&뭗PSx_oS=!ކPn+KP%Dn~ՇX=|ڪE܏!ff%#%lLEr+A?d,)c`(,Y3R4>\`0 #BrTԮ^:{wZ颈}p|㵨`W%Y{p?gjia>!ѫ%(/.T7z ,ߵ2R>)6<`w<rfluH'VH1Czf0&-H? 
14910ms (11:36:57.736)
Feb 18 11:36:57 crc kubenswrapper[4922]: Trace[1637881157]: [14.910520267s] [14.910520267s] END
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.736994 4922 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.737858 4922 trace.go:236] Trace[1971315235]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 11:36:43.651) (total time: 14086ms):
Feb 18 11:36:57 crc kubenswrapper[4922]: Trace[1971315235]: ---"Objects listed" error: 14086ms (11:36:57.737)
Feb 18 11:36:57 crc kubenswrapper[4922]: Trace[1971315235]: [14.086289522s] [14.086289522s] END
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.738139 4922 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.740988 4922 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 18 11:36:57 crc kubenswrapper[4922]: E0218 11:36:57.742152 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.742251 4922 trace.go:236] Trace[1527444132]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 11:36:42.982) (total time: 14759ms):
Feb 18 11:36:57 crc kubenswrapper[4922]: Trace[1527444132]: ---"Objects listed" error: 14759ms (11:36:57.742)
Feb 18 11:36:57 crc kubenswrapper[4922]: Trace[1527444132]: [14.759439268s] [14.759439268s] END
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.742724 4922 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.747684 4922 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 18 11:36:57 crc kubenswrapper[4922]: I0218 11:36:57.920623 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:02:03.331172927 +0000 UTC
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.094998 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.096453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.096611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.096721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.111680 4922 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.111860 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.444271 4922 csr.go:261] certificate signing request csr-k6swn is approved, waiting to be issued
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.469839 4922 csr.go:257] certificate signing request csr-k6swn is issued
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.738110 4922 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 18 11:36:58 crc kubenswrapper[4922]: W0218 11:36:58.738598 4922 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.921566 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 17:33:42.871773631 +0000 UTC
Feb 18 11:36:58 crc kubenswrapper[4922]: I0218 11:36:58.991866 4922 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.018243 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.032405 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.100763 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.101579 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.103658 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd" exitCode=255
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.103748 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd"}
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.103843 4922 scope.go:117] "RemoveContainer" containerID="3fcb9209bee7db64487a386ac170c5349cab80d8924d502f2325cab2d19761b1"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.266521 4922 scope.go:117] "RemoveContainer" containerID="4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd"
Feb 18 11:36:59 crc kubenswrapper[4922]: E0218 11:36:59.267200 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.471436 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-18 11:31:58 +0000 UTC, rotation deadline is 2026-12-06 07:05:05.524038344 +0000 UTC
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.471508 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6979h28m6.052533973s for next certificate rotation
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.900714 4922 apiserver.go:52] "Watching apiserver"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.909120 4922 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.909649 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc","openshift-multus/multus-c9xzd","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-dns/node-resolver-w46bt","openshift-multus/multus-additional-cni-plugins-26zbd","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-image-registry/node-ca-q5qkb","openshift-machine-config-operator/machine-config-daemon-znglx","openshift-ovn-kubernetes/ovnkube-node-wg4r5"]
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.910072 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.910117 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.910096 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:36:59 crc kubenswrapper[4922]: E0218 11:36:59.910208 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:36:59 crc kubenswrapper[4922]: E0218 11:36:59.910405 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.910439 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.910615 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.912043 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.912146 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w46bt"
Feb 18 11:36:59 crc kubenswrapper[4922]: E0218 11:36:59.912171 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.912712 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-26zbd"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.912824 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c9xzd"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.913635 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q5qkb"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.914838 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.914947 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.919885 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.920335 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.920398 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.920575 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.920799 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.920839 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.920956 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921052 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921058 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921090 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921185 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921188 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921390 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921468 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921525 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921585 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921598 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921756 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.921815 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.922445 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:21:49.475083279 +0000 UTC
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.923079 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.923513 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.923924 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.924274 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.924410 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.924698 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.924756 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.924700 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.924957 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925001 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925108 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925145 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925201 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925545 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925705 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.925714 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.936033 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.947082 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.964826 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fcb9209bee7db64487a386ac170c5349cab80d8924d502f2325cab2d19761b1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:52Z\\\",\\\"message\\\":\\\"W0218 11:36:42.053001 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 11:36:42.053399 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771414602 cert, and key in /tmp/serving-cert-1995526895/serving-signer.crt, /tmp/serving-cert-1995526895/serving-signer.key\\\\nI0218 11:36:42.305115 1 observer_polling.go:159] Starting file observer\\\\nW0218 11:36:42.309353 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 11:36:42.309585 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:42.311169 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1995526895/tls.crt::/tmp/serving-cert-1995526895/tls.key\\\\\\\"\\\\nF0218 11:36:52.736229 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 
11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.979220 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:36:59 crc kubenswrapper[4922]: I0218 11:36:59.992713 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.004152 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.008604 4922 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.014975 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.022991 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.032489 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.048511 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.054426 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.054712 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.054806 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.054893 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.054973 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055052 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055316 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055409 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.055493 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055599 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055695 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055779 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055870 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.055957 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056315 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056556 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056609 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056590 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056634 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056734 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.056838 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057068 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057197 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057487 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057378 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057388 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.057926 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058025 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058084 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058108 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058143 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058524 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058578 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.058730 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059046 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059212 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059319 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059352 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059397 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059425 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059451 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059490 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059523 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059552 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" 
(UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059578 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059608 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059637 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059639 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059666 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059694 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059722 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059748 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059823 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059852 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059858 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059880 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059905 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059927 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059950 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.059951 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060043 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060062 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060074 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060104 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060140 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060189 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060244 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060268 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060286 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060277 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060298 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060311 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060440 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060502 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060491 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060562 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060604 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060608 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060631 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060681 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060690 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060709 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060774 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060850 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060878 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060912 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060967 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061000 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061064 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061314 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061379 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061441 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061464 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061483 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061520 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061559 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061599 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061637 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061695 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061730 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061772 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061827 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061849 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.060966 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061092 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061344 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061373 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061429 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061451 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061456 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.061919 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062268 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062282 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062552 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062512 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062710 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062737 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062951 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.062981 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063040 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063062 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063082 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063113 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063123 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063127 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063145 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063170 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063190 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063246 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063265 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063313 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063332 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063350 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063402 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063460 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063578 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063616 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063696 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063722 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063781 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063800 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064163 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064220 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064250 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064294 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064314 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064455 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064475 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064516 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064550 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064603 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064927 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.064960 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065111 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065134 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065172 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065302 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065350 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065421 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065441 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065461 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065480 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065500 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065542 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065587 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065607 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065656 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName:
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065674 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065690 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065707 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065724 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065767 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.065785 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065829 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065855 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066457 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066756 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066802 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066872 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.067017 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067048 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067071 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067096 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067132 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067160 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067188 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067219 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067249 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067273 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067296 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067318 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.063619 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065281 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.067378 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:37:00.567333471 +0000 UTC m=+22.295037561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065312 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065412 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065424 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065432 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065435 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065583 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065764 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065837 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.065847 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066048 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066225 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066189 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066326 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066372 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066744 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.066762 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067150 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067313 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067684 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.067914 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.068620 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069002 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069131 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069267 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069305 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069350 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069558 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.069935 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.070217 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.070557 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.071134 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.072000 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.072380 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.072822 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.072984 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.073184 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.073689 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.073836 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.073983 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.074191 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075007 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075093 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075186 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075239 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075398 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075611 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075671 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075675 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.075716 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076074 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076200 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076240 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076272 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076276 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076301 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076333 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076376 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076404 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076432 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076481 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076531 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076554 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076579 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076606 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.076631 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076660 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076687 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076712 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076738 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076765 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076790 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076815 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076836 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076869 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076899 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 
11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076925 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076952 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.076979 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077008 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077065 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077097 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077126 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077151 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077188 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077217 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.077460 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077488 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077586 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.077627 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/067f44ac-9e60-4581-87cc-f2e1c823fc4c-hosts-file\") pod \"node-resolver-w46bt\" (UID: \"067f44ac-9e60-4581-87cc-f2e1c823fc4c\") " pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078061 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-node-log\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078085 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-system-cni-dir\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078107 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-log-socket\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078129 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078150 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-serviceca\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdb7cedc-b2e3-48f0-80e0-e17073b43228-proxy-tls\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 
11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078197 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-cni-multus\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078220 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45cw\" (UniqueName: \"kubernetes.io/projected/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-kube-api-access-c45cw\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-config\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078260 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-env-overrides\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078277 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-socket-dir-parent\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " 
pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-systemd-units\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078389 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-ovn\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078417 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-script-lib\") pod \"ovnkube-node-wg4r5\" (UID: 
\"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078458 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-cnibin\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-cnibin\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078506 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-k8s-cni-cncf-io\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078529 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078551 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fdb7cedc-b2e3-48f0-80e0-e17073b43228-rootfs\") pod \"machine-config-daemon-znglx\" (UID: 
\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fdb7cedc-b2e3-48f0-80e0-e17073b43228-mcd-auth-proxy-config\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078599 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-system-cni-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovn-node-metrics-cert\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078638 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-host\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078667 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078714 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-kubelet\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078742 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-cni-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079077 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9b4595ac-c521-4ada-950d-e1b01cdff99b-cni-binary-copy\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079114 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079145 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-etc-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079171 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079194 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/592c6351-c252-4c19-b3b1-167096be2de9-cni-binary-copy\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-conf-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079234 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079253 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26p2s\" (UniqueName: \"kubernetes.io/projected/653a41bb-bb1d-421c-a92b-7f2811d95edf-kube-api-access-26p2s\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079274 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-os-release\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079294 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55zqx\" (UniqueName: \"kubernetes.io/projected/592c6351-c252-4c19-b3b1-167096be2de9-kube-api-access-55zqx\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079313 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-hostroot\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079336 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-daemon-config\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079397 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079418 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-netns\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079442 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-multus-certs\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079462 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ks8d\" (UniqueName: \"kubernetes.io/projected/067f44ac-9e60-4581-87cc-f2e1c823fc4c-kube-api-access-5ks8d\") pod \"node-resolver-w46bt\" (UID: \"067f44ac-9e60-4581-87cc-f2e1c823fc4c\") " pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079480 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/592c6351-c252-4c19-b3b1-167096be2de9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079502 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079519 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-systemd\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079536 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-var-lib-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" 
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-bin\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-os-release\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-slash\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-netd\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.079652 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079670 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-kubelet\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079690 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6rh\" (UniqueName: \"kubernetes.io/projected/9b4595ac-c521-4ada-950d-e1b01cdff99b-kube-api-access-zv6rh\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079708 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-netns\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079727 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26zbd\" (UID: 
\"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-cni-bin\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079763 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079780 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgzp\" (UniqueName: \"kubernetes.io/projected/fdb7cedc-b2e3-48f0-80e0-e17073b43228-kube-api-access-mqgzp\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079803 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079821 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-etc-kubernetes\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079932 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079944 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079957 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079974 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079985 4922 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.079997 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080008 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080020 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080030 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080040 4922 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080057 4922 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080071 4922 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080081 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080090 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.080104 4922 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080113 4922 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080123 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080133 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080164 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080177 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080192 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080214 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080226 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080237 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080246 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080257 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080266 4922 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080277 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080288 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 
crc kubenswrapper[4922]: I0218 11:37:00.080298 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080314 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080326 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080340 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080351 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080390 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080408 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080424 4922 
reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080437 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080449 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080459 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080469 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080481 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080493 4922 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080512 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080527 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080540 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080551 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080562 4922 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080572 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080588 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080598 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080608 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080619 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080628 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080638 4922 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080648 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080657 4922 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080667 4922 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 
11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080677 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080691 4922 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080700 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080710 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080720 4922 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080732 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080741 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080753 4922 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080766 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080781 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080793 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080805 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080817 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080830 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080843 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080855 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080870 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080882 4922 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080897 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080909 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080930 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080945 4922 reconciler_common.go:293] "Volume detached for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080958 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080971 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.080984 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081004 4922 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081021 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081034 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081046 4922 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081059 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081071 4922 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081085 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081107 4922 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081120 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081133 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081145 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" 
DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081156 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081168 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081181 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081193 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081206 4922 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081222 4922 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081235 4922 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081248 4922 reconciler_common.go:293] "Volume detached for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.078480 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.081772 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.082100 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.082322 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.082520 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083177 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083536 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083546 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.097716 4922 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083747 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083893 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083947 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.083968 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.082562 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.087717 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.088409 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.097974 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.089287 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.089946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.098100 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.090688 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.091221 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.092048 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.092569 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.092685 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.093310 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.094015 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.094576 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.097091 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.097340 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098248 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098267 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.097556 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098389 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.098399 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098384 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:00.598284238 +0000 UTC m=+22.325988538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.097763 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098512 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:00.598476803 +0000 UTC m=+22.326180883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.098512 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.097795 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.097965 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.088995 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098420 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.098583 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.097767 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098630 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:00.598597106 +0000 UTC m=+22.326301186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.098675 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:00.598656097 +0000 UTC m=+22.326360177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.099023 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.099100 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.099116 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.099194 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.100133 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.100630 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.100671 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.101039 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.102167 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.102226 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.102879 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.103853 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.105153 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.105222 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.106146 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.106573 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.106752 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.108659 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.109729 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.114322 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.114668 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.115909 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.116392 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.116502 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.116817 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.117017 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.117025 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.117490 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.117689 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.117789 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.117898 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.118625 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119234 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119266 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119342 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119602 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119698 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.119990 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.120169 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.120292 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.120295 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.120412 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.121599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.121401 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.122733 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.122934 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.123516 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb1
04a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.123990 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.124575 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.125171 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.125766 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.126305 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.126478 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.126480 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.126513 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.126717 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.126945 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.127117 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.127826 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.128125 4922 scope.go:117] "RemoveContainer" containerID="4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd" Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.128288 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.128703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.128892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.136660 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.137288 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.137879 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.138827 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.140093 4922 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.145873 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.153256 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.157667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.171900 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-systemd-units\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182384 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-ovn\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182404 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-script-lib\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182421 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-cnibin\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182438 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-cnibin\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182454 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-socket-dir-parent\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182470 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-k8s-cni-cncf-io\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182487 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182505 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fdb7cedc-b2e3-48f0-80e0-e17073b43228-rootfs\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fdb7cedc-b2e3-48f0-80e0-e17073b43228-mcd-auth-proxy-config\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182535 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-system-cni-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovn-node-metrics-cert\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-kubelet\") pod \"ovnkube-node-wg4r5\" (UID: 
\"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182596 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-cni-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182612 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b4595ac-c521-4ada-950d-e1b01cdff99b-cni-binary-copy\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182630 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-host\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182648 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-etc-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.182677 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/592c6351-c252-4c19-b3b1-167096be2de9-cni-binary-copy\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-conf-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182707 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26p2s\" (UniqueName: \"kubernetes.io/projected/653a41bb-bb1d-421c-a92b-7f2811d95edf-kube-api-access-26p2s\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182721 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-os-release\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55zqx\" (UniqueName: \"kubernetes.io/projected/592c6351-c252-4c19-b3b1-167096be2de9-kube-api-access-55zqx\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: 
I0218 11:37:00.182753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-hostroot\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182768 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-daemon-config\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182783 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-netns\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182799 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-multus-certs\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182815 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ks8d\" (UniqueName: \"kubernetes.io/projected/067f44ac-9e60-4581-87cc-f2e1c823fc4c-kube-api-access-5ks8d\") pod \"node-resolver-w46bt\" (UID: \"067f44ac-9e60-4581-87cc-f2e1c823fc4c\") " pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182830 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/592c6351-c252-4c19-b3b1-167096be2de9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182844 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182860 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-systemd\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182877 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-var-lib-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182893 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-bin\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182908 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-os-release\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182923 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-slash\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182937 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-netd\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-kubelet\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.182991 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6rh\" (UniqueName: \"kubernetes.io/projected/9b4595ac-c521-4ada-950d-e1b01cdff99b-kube-api-access-zv6rh\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-netns\") pod \"ovnkube-node-wg4r5\" (UID: 
\"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183024 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183039 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-cni-bin\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183055 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgzp\" (UniqueName: \"kubernetes.io/projected/fdb7cedc-b2e3-48f0-80e0-e17073b43228-kube-api-access-mqgzp\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183108 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-etc-kubernetes\") pod \"multus-c9xzd\" (UID: 
\"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183125 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/067f44ac-9e60-4581-87cc-f2e1c823fc4c-hosts-file\") pod \"node-resolver-w46bt\" (UID: \"067f44ac-9e60-4581-87cc-f2e1c823fc4c\") " pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183141 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-node-log\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183157 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-system-cni-dir\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-log-socket\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183209 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-serviceca\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183222 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdb7cedc-b2e3-48f0-80e0-e17073b43228-proxy-tls\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-cni-multus\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183252 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45cw\" (UniqueName: \"kubernetes.io/projected/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-kube-api-access-c45cw\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183269 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-config\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.183295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-env-overrides\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183336 4922 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183349 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183382 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183391 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183401 4922 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183411 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" 
DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183421 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183431 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183443 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183452 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183462 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183481 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183491 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.183501 4922 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183511 4922 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183522 4922 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183532 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183542 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183553 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183563 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183573 4922 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183583 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183594 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183604 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183615 4922 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183625 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183635 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183645 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183655 4922 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183665 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183675 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183684 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183694 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183704 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183714 4922 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 18 
11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183725 4922 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183735 4922 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183745 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183756 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183766 4922 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183777 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183787 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.183796 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183805 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183814 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183824 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183855 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183866 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183875 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183883 4922 reconciler_common.go:293] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183892 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183901 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183911 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183919 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183928 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183937 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183945 4922 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183953 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183961 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183970 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183980 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183988 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.183997 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184006 4922 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" 
DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184015 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184023 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184031 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184041 4922 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184050 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184059 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184068 4922 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184077 4922 reconciler_common.go:293] "Volume detached 
for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184085 4922 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184094 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184103 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184111 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184120 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184128 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184137 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184145 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184153 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184161 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184170 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184178 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184187 4922 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184195 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node 
\"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184203 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184213 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184223 4922 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184232 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184240 4922 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184248 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184402 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-multus-certs\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.184442 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-systemd-units\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-ovn\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184789 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-slash\") pod \"ovnkube-node-wg4r5\" (UID: 
\"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184945 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184965 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-netd\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-kubelet\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185036 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-script-lib\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185096 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-host\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc 
kubenswrapper[4922]: I0218 11:37:00.185101 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-netns\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185171 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-etc-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.184836 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-cni-bin\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-hostroot\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185217 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-os-release\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185277 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/fdb7cedc-b2e3-48f0-80e0-e17073b43228-rootfs\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185333 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-cnibin\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185417 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-cnibin\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185472 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-socket-dir-parent\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185513 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-k8s-cni-cncf-io\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185549 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185587 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-var-lib-openvswitch\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-os-release\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185625 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-systemd\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185756 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-kubelet\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-bin\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.185795 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-conf-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185874 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-system-cni-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185886 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-log-socket\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185941 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.185854 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-run-netns\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186003 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-cni-dir\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186063 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-host-var-lib-cni-multus\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-env-overrides\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186086 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/592c6351-c252-4c19-b3b1-167096be2de9-cni-binary-copy\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186106 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-node-log\") pod \"ovnkube-node-wg4r5\" (UID: 
\"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186096 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b4595ac-c521-4ada-950d-e1b01cdff99b-etc-kubernetes\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/067f44ac-9e60-4581-87cc-f2e1c823fc4c-hosts-file\") pod \"node-resolver-w46bt\" (UID: \"067f44ac-9e60-4581-87cc-f2e1c823fc4c\") " pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186309 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186345 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592c6351-c252-4c19-b3b1-167096be2de9-system-cni-dir\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.186430 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b4595ac-c521-4ada-950d-e1b01cdff99b-multus-daemon-config\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " 
pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.187008 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-config\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.187305 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-serviceca\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.187938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fdb7cedc-b2e3-48f0-80e0-e17073b43228-mcd-auth-proxy-config\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.188055 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/592c6351-c252-4c19-b3b1-167096be2de9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.188198 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b4595ac-c521-4ada-950d-e1b01cdff99b-cni-binary-copy\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: 
I0218 11:37:00.188960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovn-node-metrics-cert\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.192098 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdb7cedc-b2e3-48f0-80e0-e17073b43228-proxy-tls\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.208001 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.211320 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55zqx\" (UniqueName: \"kubernetes.io/projected/592c6351-c252-4c19-b3b1-167096be2de9-kube-api-access-55zqx\") pod \"multus-additional-cni-plugins-26zbd\" (UID: \"592c6351-c252-4c19-b3b1-167096be2de9\") " pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.212019 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgzp\" (UniqueName: \"kubernetes.io/projected/fdb7cedc-b2e3-48f0-80e0-e17073b43228-kube-api-access-mqgzp\") pod \"machine-config-daemon-znglx\" (UID: \"fdb7cedc-b2e3-48f0-80e0-e17073b43228\") " pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.212110 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45cw\" (UniqueName: \"kubernetes.io/projected/3cd3723d-a12f-4c7c-a1ea-63bfef3c931a-kube-api-access-c45cw\") pod \"node-ca-q5qkb\" (UID: \"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\") " pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.212741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-26p2s\" (UniqueName: \"kubernetes.io/projected/653a41bb-bb1d-421c-a92b-7f2811d95edf-kube-api-access-26p2s\") pod \"ovnkube-node-wg4r5\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.212771 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6rh\" (UniqueName: \"kubernetes.io/projected/9b4595ac-c521-4ada-950d-e1b01cdff99b-kube-api-access-zv6rh\") pod \"multus-c9xzd\" (UID: \"9b4595ac-c521-4ada-950d-e1b01cdff99b\") " pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.214210 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ks8d\" (UniqueName: \"kubernetes.io/projected/067f44ac-9e60-4581-87cc-f2e1c823fc4c-kube-api-access-5ks8d\") pod \"node-resolver-w46bt\" (UID: \"067f44ac-9e60-4581-87cc-f2e1c823fc4c\") " pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.219617 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.228635 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.234140 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.241348 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.248330 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w46bt" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.248590 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-91a352607919a46845c4d118bbbb9255f0c9a7fa067c63dc230d7e24435313c3 WatchSource:0}: Error finding container 91a352607919a46845c4d118bbbb9255f0c9a7fa067c63dc230d7e24435313c3: Status 404 returned error can't find the container with id 91a352607919a46845c4d118bbbb9255f0c9a7fa067c63dc230d7e24435313c3 Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.251493 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.258200 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-26zbd" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.258831 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-f9c954d8a26388280d54536ac050bf4b9fc46a86a540e8ea8272bbb404b2be12 WatchSource:0}: Error finding container f9c954d8a26388280d54536ac050bf4b9fc46a86a540e8ea8272bbb404b2be12: Status 404 returned error can't find the container with id f9c954d8a26388280d54536ac050bf4b9fc46a86a540e8ea8272bbb404b2be12 Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.263478 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c9xzd" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.263890 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod067f44ac_9e60_4581_87cc_f2e1c823fc4c.slice/crio-5b7c4d01264221ec01062dd316e3f65f1514dda9ec238d45b751f38f58ca7e76 WatchSource:0}: Error finding container 5b7c4d01264221ec01062dd316e3f65f1514dda9ec238d45b751f38f58ca7e76: Status 404 returned error can't find the container with id 5b7c4d01264221ec01062dd316e3f65f1514dda9ec238d45b751f38f58ca7e76 Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.272011 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.273775 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.279994 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q5qkb" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.282970 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.286979 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.293970 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.295017 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.301069 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4595ac_c521_4ada_950d_e1b01cdff99b.slice/crio-369d3ce9933d0ba2aa5337839d9116d19adb4c543b8bd377f6a5a4cf7dbcd90d WatchSource:0}: Error finding container 369d3ce9933d0ba2aa5337839d9116d19adb4c543b8bd377f6a5a4cf7dbcd90d: Status 404 returned error can't find the container with id 369d3ce9933d0ba2aa5337839d9116d19adb4c543b8bd377f6a5a4cf7dbcd90d Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.309545 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.314805 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-54336f292dc20f541fbc275cfb15270551b4b10cabac510fcffaa8364cc0a3cc WatchSource:0}: Error finding container 54336f292dc20f541fbc275cfb15270551b4b10cabac510fcffaa8364cc0a3cc: Status 404 returned error can't find the container with id 54336f292dc20f541fbc275cfb15270551b4b10cabac510fcffaa8364cc0a3cc Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.324406 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.339961 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.349629 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd3723d_a12f_4c7c_a1ea_63bfef3c931a.slice/crio-fea43d3481437797091f71baf35e55a9212d9c3819a4ee4d5a880815d5152258 WatchSource:0}: Error finding container fea43d3481437797091f71baf35e55a9212d9c3819a4ee4d5a880815d5152258: Status 404 returned error can't find the container 
with id fea43d3481437797091f71baf35e55a9212d9c3819a4ee4d5a880815d5152258 Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.360510 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.382109 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.398716 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:00 crc kubenswrapper[4922]: W0218 11:37:00.432377 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod653a41bb_bb1d_421c_a92b_7f2811d95edf.slice/crio-925e7e67a4fbe78eeed080fb6248ee3ea896e9f6ac32d11416ce15ab5e43d0fb WatchSource:0}: Error finding container 925e7e67a4fbe78eeed080fb6248ee3ea896e9f6ac32d11416ce15ab5e43d0fb: Status 404 returned error can't find the container with id 925e7e67a4fbe78eeed080fb6248ee3ea896e9f6ac32d11416ce15ab5e43d0fb Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.587963 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.588163 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:37:01.588146578 +0000 UTC m=+23.315850658 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.688728 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.688781 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.688817 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.688837 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.688908 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.688943 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.688973 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.688956 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.689003 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.689015 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.689053 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.688985 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.689626 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:01.688955207 +0000 UTC m=+23.416659287 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.689708 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:01.689694764 +0000 UTC m=+23.417398844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.689736 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:01.689727425 +0000 UTC m=+23.417431505 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:00 crc kubenswrapper[4922]: E0218 11:37:00.690420 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:01.689747535 +0000 UTC m=+23.417451615 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.922741 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 11:34:29.173361707 +0000 UTC Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.976699 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.978043 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.980275 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.981672 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.982809 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.983772 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.984786 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.985987 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.987137 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.988045 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.989186 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.990418 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.991159 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.992546 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.993182 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.993843 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.994550 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.994971 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 
11:37:00.995552 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.996174 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.996682 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.997244 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.997754 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.998704 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.999174 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 18 11:37:00 crc kubenswrapper[4922]: I0218 11:37:00.999798 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 
11:37:01.000556 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.003078 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.005130 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.007286 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.008005 4922 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.008165 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.010169 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.011150 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" 
path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.011885 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.013522 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.014560 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.015344 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.016345 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.017346 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.018983 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.019935 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.021830 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.022942 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.024292 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.026570 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.028548 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.029673 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.030559 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.031348 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.032118 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.033076 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.034130 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.035115 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.131860 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.131919 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.131935 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" 
event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"184f28cf2d4378c04e9175430295b3af9bc0d81faedb6ccbf913076a666644cb"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.133500 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" exitCode=0 Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.133579 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.133630 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"925e7e67a4fbe78eeed080fb6248ee3ea896e9f6ac32d11416ce15ab5e43d0fb"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.142655 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.142718 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"91a352607919a46845c4d118bbbb9255f0c9a7fa067c63dc230d7e24435313c3"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.145591 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w46bt" 
event={"ID":"067f44ac-9e60-4581-87cc-f2e1c823fc4c","Type":"ContainerStarted","Data":"c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.145632 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w46bt" event={"ID":"067f44ac-9e60-4581-87cc-f2e1c823fc4c","Type":"ContainerStarted","Data":"5b7c4d01264221ec01062dd316e3f65f1514dda9ec238d45b751f38f58ca7e76"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.148476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.148515 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.148528 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f9c954d8a26388280d54536ac050bf4b9fc46a86a540e8ea8272bbb404b2be12"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.150915 4922 generic.go:334] "Generic (PLEG): container finished" podID="592c6351-c252-4c19-b3b1-167096be2de9" containerID="b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e" exitCode=0 Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.150974 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" 
event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerDied","Data":"b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.150993 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerStarted","Data":"279f20a7de4fb8ce518321ba3f9ea1dfd8f527c83b87cfc76af3a8271e76a690"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.153920 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"54336f292dc20f541fbc275cfb15270551b4b10cabac510fcffaa8364cc0a3cc"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.154548 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"na
me\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\
"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.156294 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q5qkb" event={"ID":"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a","Type":"ContainerStarted","Data":"61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.156378 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q5qkb" 
event={"ID":"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a","Type":"ContainerStarted","Data":"fea43d3481437797091f71baf35e55a9212d9c3819a4ee4d5a880815d5152258"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.160894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerStarted","Data":"83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.160936 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerStarted","Data":"369d3ce9933d0ba2aa5337839d9116d19adb4c543b8bd377f6a5a4cf7dbcd90d"} Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.168744 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.181174 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.191929 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.201862 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.224076 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.234250 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.247583 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.258493 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.271703 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.290186 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.306175 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.318770 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.339521 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.354006 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.374381 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.398679 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.416624 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.430113 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.453585 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.466135 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.482759 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.494759 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.514005 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.538344 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.551053 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.569729 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.581141 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.603507 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.603681 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:37:03.603662828 +0000 UTC m=+25.331366908 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.704674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.704728 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.704770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.704798 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.704892 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.704927 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.704958 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:03.704942028 +0000 UTC m=+25.432646128 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.704961 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.704980 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-18 11:37:03.704967518 +0000 UTC m=+25.432671598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.705000 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.705018 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.704973 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.705088 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:03.705067471 +0000 UTC m=+25.432771611 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.705088 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.705115 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.705153 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:03.705141482 +0000 UTC m=+25.432845562 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.923045 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 07:29:19.51094808 +0000 UTC Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.972608 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.972648 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:01 crc kubenswrapper[4922]: I0218 11:37:01.972664 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.972759 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.972809 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:01 crc kubenswrapper[4922]: E0218 11:37:01.972870 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.166247 4922 generic.go:334] "Generic (PLEG): container finished" podID="592c6351-c252-4c19-b3b1-167096be2de9" containerID="89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e" exitCode=0 Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.166398 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerDied","Data":"89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e"} Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.174046 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} Feb 18 11:37:02 crc 
kubenswrapper[4922]: I0218 11:37:02.174102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.174116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.174127 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.174138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.174152 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.188199 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.229684 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.263448 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.293298 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.310848 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.326506 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.342396 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.354178 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.368792 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.390219 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.425848 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.446893 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.460444 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.475631 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.895849 4922 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 18 11:37:02 crc kubenswrapper[4922]: I0218 11:37:02.923640 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 04:49:26.847003208 +0000 UTC Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.179471 4922 generic.go:334] "Generic (PLEG): container finished" podID="592c6351-c252-4c19-b3b1-167096be2de9" containerID="6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00" exitCode=0 Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.179558 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerDied","Data":"6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00"} Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.182995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04"} Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.214045 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.240830 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.258965 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.273752 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.288487 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.300786 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.313887 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.330790 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.344587 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.360202 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 
11:37:03.374773 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.388062 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.403977 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.418298 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.431523 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.448124 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.463771 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.478495 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.497318 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.510785 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.525150 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.538418 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.560513 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.586782 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.603992 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.619285 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.627834 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.627996 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:37:07.627973772 +0000 UTC m=+29.355677852 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.634135 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.646440 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.729254 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.729310 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.729346 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.729395 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729496 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729548 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729567 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729578 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729586 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-18 11:37:07.729565807 +0000 UTC m=+29.457269887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729626 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:07.729613659 +0000 UTC m=+29.457317739 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729671 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729689 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:07.72968372 +0000 UTC m=+29.457387800 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729728 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729737 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729744 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.729761 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:07.729755892 +0000 UTC m=+29.457459972 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.924786 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:48:38.53131069 +0000 UTC Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.972729 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.972762 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.972867 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:03 crc kubenswrapper[4922]: I0218 11:37:03.972729 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.972966 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:03 crc kubenswrapper[4922]: E0218 11:37:03.973057 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.143424 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.145603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.145661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.145674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.145750 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.153559 4922 kubelet_node_status.go:115] "Node was previously registered" 
node="crc" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.153788 4922 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.154834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.154896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.154913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.154951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.154966 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: E0218 11:37:04.170647 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.175317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.175340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.175351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.175378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.175391 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.189796 4922 generic.go:334] "Generic (PLEG): container finished" podID="592c6351-c252-4c19-b3b1-167096be2de9" containerID="66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5" exitCode=0 Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.189909 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerDied","Data":"66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5"} Feb 18 11:37:04 crc kubenswrapper[4922]: E0218 11:37:04.193874 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.197646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.197685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.197697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.197720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.197733 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.198059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.211848 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\
\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: E0218 11:37:04.216247 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.220744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.220806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.220818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.220837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.220850 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.229135 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: E0218 11:37:04.237413 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.240671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.240701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.240714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.240733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.240746 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.250032 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: E0218 11:37:04.254716 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: E0218 11:37:04.254867 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.258412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.258475 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.258517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.258556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.258583 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.266378 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.280486 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.301654 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.313155 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.329700 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.344685 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.362924 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.363762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.363823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.363837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.363862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.363879 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.379126 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.392507 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.412823 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.427039 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.467180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.467219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.467232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.467252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.467268 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.570277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.570333 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.570345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.570376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.570387 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.673459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.673512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.673525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.673546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.673560 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.776532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.776886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.776926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.776963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.776986 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.879886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.879937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.879951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.879974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.879988 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.925327 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 02:12:23.933964386 +0000 UTC Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.985552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.985630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.985645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.985677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:04 crc kubenswrapper[4922]: I0218 11:37:04.985697 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:04Z","lastTransitionTime":"2026-02-18T11:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.088529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.088587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.088604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.088629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.088649 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.117039 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.122435 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.132017 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.143423 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.161724 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.186176 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.191300 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.191384 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.191406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.191433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.191453 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.205601 4922 generic.go:334] "Generic (PLEG): container finished" podID="592c6351-c252-4c19-b3b1-167096be2de9" containerID="c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959" exitCode=0 Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.205698 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerDied","Data":"c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.211837 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.231071 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.256508 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.282131 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.293589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.293649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.293664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc 
kubenswrapper[4922]: I0218 11:37:05.293689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.293707 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.302449 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.338717 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.357271 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.382252 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.397243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.397293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.397307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.397329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.397343 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.399914 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.425378 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.437802 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.457174 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.475736 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.494136 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.500241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.500335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.500401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.500441 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.500467 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.511587 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.528805 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.542721 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.557077 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.577226 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.592880 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.603483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.603533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.603543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc 
kubenswrapper[4922]: I0218 11:37:05.603563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.603579 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.620183 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.649748 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.668385 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.682403 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.696933 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.707383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.707424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.707436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.707454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.707464 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.712582 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.811009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.811093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.811142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.811173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.811198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.914231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.914295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.914314 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.914339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.914385 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:05Z","lastTransitionTime":"2026-02-18T11:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.925546 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:25:23.86038241 +0000 UTC Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.972469 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.972532 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:05 crc kubenswrapper[4922]: E0218 11:37:05.972573 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:05 crc kubenswrapper[4922]: I0218 11:37:05.972611 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:05 crc kubenswrapper[4922]: E0218 11:37:05.972774 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:05 crc kubenswrapper[4922]: E0218 11:37:05.972968 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.016988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.017029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.017038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.017053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.017064 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.120798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.120838 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.120846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.120863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.120873 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223572 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223563 4922 generic.go:334] "Generic (PLEG): container finished" podID="592c6351-c252-4c19-b3b1-167096be2de9" containerID="37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296" exitCode=0 Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.223641 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerDied","Data":"37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296"} Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.245953 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.263628 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.283766 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.300478 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.319832 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.327316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.327419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 
11:37:06.327448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.327482 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.327512 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.340087 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.357818 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.377479 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.408687 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.424113 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.430421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.430487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.430513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.430543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.430567 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.436071 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.446975 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.462968 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.488378 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.503239 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.533187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.533230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.533243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 
11:37:06.533261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.533273 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.636245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.636305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.636320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.636345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.636384 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.739100 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.739208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.739233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.739265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.739287 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.842440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.842488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.842500 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.842523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.842537 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.926374 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 20:42:27.171890642 +0000 UTC
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.946650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.947090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.947110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.947140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:06 crc kubenswrapper[4922]: I0218 11:37:06.947160 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:06Z","lastTransitionTime":"2026-02-18T11:37:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.049911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.049949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.049959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.049975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.049986 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.153271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.153325 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.153333 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.153351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.153384 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.256596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.256703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.256723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.256805 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.256825 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.360702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.360750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.360761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.360780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.360795 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.463125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.463184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.463206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.463228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.463245 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.485120 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.486016 4922 scope.go:117] "RemoveContainer" containerID="4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd"
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.486335 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.566885 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.566971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.566993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.567027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.567051 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.670353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.670422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.670437 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.670458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.670473 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.676326 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.676463 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:37:15.676446509 +0000 UTC m=+37.404150589 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.772427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.772463 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.772472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.772488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.772499 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.777070 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.777129 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.777178 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.777220 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777249 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777276 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777315 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:15.777295466 +0000 UTC m=+37.504999546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777330 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:15.777324197 +0000 UTC m=+37.505028277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777371 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777398 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777415 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777483 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:15.777471341 +0000 UTC m=+37.505175641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777580 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777610 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777626 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.777701 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:15.777679676 +0000 UTC m=+37.505383776 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.875809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.875854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.875864 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.875882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.875896 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.926937 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:48:02.920680787 +0000 UTC Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.972413 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.972463 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.972545 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.972674 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.972736 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:07 crc kubenswrapper[4922]: E0218 11:37:07.972890 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.979094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.979152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.979170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.979199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:07 crc kubenswrapper[4922]: I0218 11:37:07.979217 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:07Z","lastTransitionTime":"2026-02-18T11:37:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.083123 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.083189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.083212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.083244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.083267 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.186162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.186195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.186204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.186224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.186235 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.232497 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" event={"ID":"592c6351-c252-4c19-b3b1-167096be2de9","Type":"ContainerStarted","Data":"fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.253661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.254146 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.254273 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.256696 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef
2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.269971 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.285436 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288624 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288654 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288766 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.288860 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.298846 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.308714 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.320129 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.330685 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.343231 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.360114 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.382854 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.392512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.392562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.392573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.392592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.392621 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.396250 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.408788 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.423834 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.437349 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.449590 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.467804 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.483899 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.495661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.495699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.495711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.495733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.495744 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.498340 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.511143 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.524601 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.545532 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.561351 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.576730 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.590425 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.598575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.598652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.598667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.598689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.598912 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.604896 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.634940 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.650478 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.666959 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef
2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.683604 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.702551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.702620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.702638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.702659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.702671 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.704039 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.805686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.805768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.805796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.805829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.805854 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.908942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.909033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.909059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.909094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.909118 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:08Z","lastTransitionTime":"2026-02-18T11:37:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.927287 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 05:20:31.770359106 +0000 UTC Feb 18 11:37:08 crc kubenswrapper[4922]: I0218 11:37:08.997591 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.012886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.012966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.012987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.013014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.013031 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.020476 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.034122 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.054571 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.071670 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.088410 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.115140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.115483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.115587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc 
kubenswrapper[4922]: I0218 11:37:09.115670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.115768 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.116975 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.146468 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.165027 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.178497 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.191259 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.202678 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.218018 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef
2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.218795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.218838 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.218850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.218891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.218904 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.231512 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672
d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.249858 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.255869 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.322067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.322108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.322121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.322138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.322150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.425464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.425681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.425707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.425722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.425732 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.529137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.529186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.529203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.529226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.529251 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.632877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.632947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.632970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.633002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.633026 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.736595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.736986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.737002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.737021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.737035 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.840854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.840929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.840941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.840964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.840979 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.927626 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 13:51:40.002640609 +0000 UTC Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.943679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.943733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.943742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.943756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.943766 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:09Z","lastTransitionTime":"2026-02-18T11:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.973112 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.973184 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:09 crc kubenswrapper[4922]: I0218 11:37:09.973163 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:09 crc kubenswrapper[4922]: E0218 11:37:09.973333 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:09 crc kubenswrapper[4922]: E0218 11:37:09.973451 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:09 crc kubenswrapper[4922]: E0218 11:37:09.973599 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.047959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.048022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.048041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.048069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.048087 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.151559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.151627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.151646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.151675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.151726 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.254998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.255045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.255057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.255073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.255085 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.260297 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.366099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.366152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.366206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.366222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.366234 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.468209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.468240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.468248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.468281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.468291 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.570584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.570628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.570665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.570681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.570692 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.674660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.674697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.674709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.674725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.674738 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.776959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.776995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.777003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.777016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.777025 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.880425 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.880487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.880505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.880527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.880543 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.928438 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 11:15:19.402143584 +0000 UTC Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.983027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.983140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.983151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.983166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:10 crc kubenswrapper[4922]: I0218 11:37:10.983180 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:10Z","lastTransitionTime":"2026-02-18T11:37:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.085296 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.085326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.085335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.085347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.085356 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.187461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.187503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.187513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.187527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.187539 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.289607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.289637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.289645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.289657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.289665 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.392348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.392416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.392429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.392451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.392466 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.495646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.495716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.495730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.495756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.495772 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.598577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.598646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.598656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.598673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.598682 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.702127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.702213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.702235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.702276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.702295 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.791454 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs"] Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.792216 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.796831 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.800595 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.805830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.805878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.805891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.805909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.805920 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.812840 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.838579 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.859683 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.878708 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/648f85d5-dbc6-4db6-b590-3edc96740212-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.878796 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8l2m\" (UniqueName: \"kubernetes.io/projected/648f85d5-dbc6-4db6-b590-3edc96740212-kube-api-access-t8l2m\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 
18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.878845 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/648f85d5-dbc6-4db6-b590-3edc96740212-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.878876 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/648f85d5-dbc6-4db6-b590-3edc96740212-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.879920 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.899444 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.909068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.909139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 
11:37:11.909157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.909191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.909212 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:11Z","lastTransitionTime":"2026-02-18T11:37:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.919887 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c9
44062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.929406 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 07:50:41.61267516 +0000 UTC Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.937993 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.955070 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.972580 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.972753 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.972754 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:11 crc kubenswrapper[4922]: E0218 11:37:11.972973 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:11 crc kubenswrapper[4922]: E0218 11:37:11.973057 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:11 crc kubenswrapper[4922]: E0218 11:37:11.973128 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.979274 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/648f85d5-dbc6-4db6-b590-3edc96740212-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.979335 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8l2m\" (UniqueName: \"kubernetes.io/projected/648f85d5-dbc6-4db6-b590-3edc96740212-kube-api-access-t8l2m\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.979406 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/648f85d5-dbc6-4db6-b590-3edc96740212-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.979445 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/648f85d5-dbc6-4db6-b590-3edc96740212-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.980217 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/648f85d5-dbc6-4db6-b590-3edc96740212-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.980936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/648f85d5-dbc6-4db6-b590-3edc96740212-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.990236 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:11Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:11 crc kubenswrapper[4922]: I0218 11:37:11.990524 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/648f85d5-dbc6-4db6-b590-3edc96740212-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.001311 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8l2m\" (UniqueName: \"kubernetes.io/projected/648f85d5-dbc6-4db6-b590-3edc96740212-kube-api-access-t8l2m\") pod \"ovnkube-control-plane-749d76644c-hjnxs\" (UID: \"648f85d5-dbc6-4db6-b590-3edc96740212\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.018679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.018733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.018743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.018759 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.018769 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.021353 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.034953 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.048574 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.074551 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.096770 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.120084 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.120940 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.122319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.122343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.122357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.122400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.122414 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.168122 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.227892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.227972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.227984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.228002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.228014 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.268661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" event={"ID":"648f85d5-dbc6-4db6-b590-3edc96740212","Type":"ContainerStarted","Data":"1bcc10b804e00f4e192af1480d9a0c3b7bddddf3bf72aaefae3d196eefc623b3"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.270730 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/0.log" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.272976 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6" exitCode=1 Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.273021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.273767 4922 scope.go:117] "RemoveContainer" containerID="b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.291000 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.305797 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.321855 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.332978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.333245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.333266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.333555 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.333920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.334022 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.347293 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf
256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.363468 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.379856 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.408467 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.428167 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.436507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.436554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.436564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.436580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.436590 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.446881 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.464310 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11
:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.478271 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.494600 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.513931 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.535343 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.539218 4922 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.539250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.539259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.539277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.539288 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.551867 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:12Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.640919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.640976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.640984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.640996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.641007 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.744044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.744097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.744115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.744136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.744152 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.846680 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.846727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.846743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.846763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.846778 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.929705 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 14:37:13.876288719 +0000 UTC Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.949836 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.949887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.949905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.949929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:12 crc kubenswrapper[4922]: I0218 11:37:12.949947 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:12Z","lastTransitionTime":"2026-02-18T11:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.051912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.051978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.051990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.052006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.052018 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.155046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.155086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.155097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.155114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.155126 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.258195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.258232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.258241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.258257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.258293 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.360730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.360768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.360778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.360791 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.360800 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.463023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.463060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.463068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.463082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.463099 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.565600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.565648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.565659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.565676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.565690 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.668128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.668191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.668208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.668234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.668263 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.728070 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pspfr"] Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.728965 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:13 crc kubenswrapper[4922]: E0218 11:37:13.729092 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.748019 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.765232 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.770387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.770431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.770442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.770458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.770470 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.817868 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.817931 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7224x\" (UniqueName: \"kubernetes.io/projected/4702cf45-b47b-4291-a553-5bfc7bc22674-kube-api-access-7224x\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.819645 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.840263 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.853668 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.864690 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc 
kubenswrapper[4922]: I0218 11:37:13.872587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.872648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.872665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.872688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.872703 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.880150 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf
256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.893295 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.912430 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.918629 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7224x\" (UniqueName: \"kubernetes.io/projected/4702cf45-b47b-4291-a553-5bfc7bc22674-kube-api-access-7224x\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.918723 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:13 crc kubenswrapper[4922]: E0218 11:37:13.918962 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:13 crc kubenswrapper[4922]: E0218 
11:37:13.919025 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:14.419004649 +0000 UTC m=+36.146708739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.930517 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 14:35:41.676364415 +0000 UTC Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.931803 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.941846 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7224x\" (UniqueName: \"kubernetes.io/projected/4702cf45-b47b-4291-a553-5bfc7bc22674-kube-api-access-7224x\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:13 crc 
kubenswrapper[4922]: I0218 11:37:13.947970 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.971215 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.972252 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.972269 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.972267 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:13 crc kubenswrapper[4922]: E0218 11:37:13.972475 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:13 crc kubenswrapper[4922]: E0218 11:37:13.972608 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:13 crc kubenswrapper[4922]: E0218 11:37:13.972719 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.974971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.975029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.975045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.975067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.975081 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:13Z","lastTransitionTime":"2026-02-18T11:37:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:13 crc kubenswrapper[4922]: I0218 11:37:13.988850 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.003251 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11
:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.019382 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host
\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.036476 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.059739 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.076853 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.076887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.076898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.076915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.076929 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.180010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.180072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.180087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.180111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.180126 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.282220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.282264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.282274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.282294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.282310 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.283496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" event={"ID":"648f85d5-dbc6-4db6-b590-3edc96740212","Type":"ContainerStarted","Data":"61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.283543 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" event={"ID":"648f85d5-dbc6-4db6-b590-3edc96740212","Type":"ContainerStarted","Data":"2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.286492 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/0.log" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.297520 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.297716 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.298467 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.309085 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.321486 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.333937 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.341535 4922 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.341588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.341602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.341624 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.341642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.350192 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.353426 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.358891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.358923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.358932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.358947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.358957 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.363985 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.369227 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.371982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.372007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.372016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.372032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.372042 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.374671 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.384306 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.385998 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.388155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.388191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.388203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.388220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.388231 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.396092 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.402878 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.408005 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc 
kubenswrapper[4922]: I0218 11:37:14.408645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.408688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.408701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.408720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.408732 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.422212 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.422400 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.422417 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.422793 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " 
pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.422954 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:14 crc kubenswrapper[4922]: E0218 11:37:14.423029 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:15.42301068 +0000 UTC m=+37.150714760 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.425007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.425039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.425051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.425071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.425087 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.439103 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.449760 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.470705 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.490280 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.506286 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.521691 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.527499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc 
kubenswrapper[4922]: I0218 11:37:14.527546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.527558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.527576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.527592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.538141 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.551462 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.563910 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.578287 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.604693 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.631892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.631968 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.631985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.632009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.632025 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.637057 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.648125 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.666522 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18
T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.687484 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.701724 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.717427 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.734664 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.736432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.736496 4922 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.736515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.736543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.736569 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.749419 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.767568 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.784865 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.797136 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.810105 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:14Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:14 crc 
kubenswrapper[4922]: I0218 11:37:14.838818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.838859 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.838869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.838884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.838895 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.931260 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 04:34:29.893011706 +0000 UTC Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.941272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.941317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.941328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.941344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:14 crc kubenswrapper[4922]: I0218 11:37:14.941356 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:14Z","lastTransitionTime":"2026-02-18T11:37:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.044760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.044806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.044817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.044836 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.044848 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.147270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.147407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.147421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.147439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.147459 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.250815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.250870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.250882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.250902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.250914 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.306599 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/1.log"
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.307829 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/0.log"
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.311356 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746" exitCode=1
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.311573 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746"}
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.311676 4922 scope.go:117] "RemoveContainer" containerID="b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6"
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.312858 4922 scope.go:117] "RemoveContainer" containerID="a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746"
Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.313150 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf"
Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.336349 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772
bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.355095 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.355185 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.355201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.355224 4922 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.355239 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.357677 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.375502 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.387203 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.399204 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.410944 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.421044 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.431205 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc 
kubenswrapper[4922]: I0218 11:37:15.433417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.433601 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.433658 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:17.433644643 +0000 UTC m=+39.161348723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.443068 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.453050 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.458822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.458866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.458876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 
11:37:15.458897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.458917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.463741 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.487593 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port 
openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.507555 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d2
02c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.521467 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.545074 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.559242 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.561420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.561721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.561766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.561790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.561802 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.571009 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:15Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.665759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.665816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.665828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.665877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.665890 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.737108 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.737394 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:37:31.737312652 +0000 UTC m=+53.465016772 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.769291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.769348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.769391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.769417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: 
I0218 11:37:15.769436 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.839525 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.839592 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.839655 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.839709 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839778 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839812 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839831 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839836 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839831 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839867 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839897 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:31.83987587 +0000 UTC m=+53.567579970 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839905 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839920 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:31.83990949 +0000 UTC m=+53.567613590 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839926 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839942 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:31.839933001 +0000 UTC m=+53.567637091 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.839995 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:31.839970572 +0000 UTC m=+53.567674692 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.874483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.874543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.874557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.874582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.874600 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.932679 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:10:49.156870836 +0000 UTC Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.972805 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.972931 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.973226 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.973663 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.973959 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.973995 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.974206 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:15 crc kubenswrapper[4922]: E0218 11:37:15.974469 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.977213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.977305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.977469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.977897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:15 crc kubenswrapper[4922]: I0218 11:37:15.978019 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:15Z","lastTransitionTime":"2026-02-18T11:37:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.081016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.081076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.081094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.081121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.081140 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.184311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.184417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.184436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.184462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.184484 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.288387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.288461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.288495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.288528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.288549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.318770 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/1.log" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.392595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.392649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.392668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.392693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.392709 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.495539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.495585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.495595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.495614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.495627 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.597975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.598041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.598056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.598082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.598104 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.701009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.701066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.701081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.701106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.701122 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.804323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.804397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.804411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.804432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.804456 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.907749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.907824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.907841 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.907908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.907930 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:16Z","lastTransitionTime":"2026-02-18T11:37:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:16 crc kubenswrapper[4922]: I0218 11:37:16.933236 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 03:05:20.182744194 +0000 UTC Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.011407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.011468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.011483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.011525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.011541 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.114912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.114960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.114973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.114991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.115003 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.218399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.218453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.218467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.218485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.218501 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.321898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.321952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.321965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.321985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.321998 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.425097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.425163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.425183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.425209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.425229 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.457823 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:17 crc kubenswrapper[4922]: E0218 11:37:17.458026 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:17 crc kubenswrapper[4922]: E0218 11:37:17.458142 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:21.458120507 +0000 UTC m=+43.185824587 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.528844 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.528899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.528908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.528927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.528938 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.632726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.632782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.632795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.632812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.632827 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.735653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.735702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.735710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.735726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.735736 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.838728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.839344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.839432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.839472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.839517 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.934113 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 16:59:59.31929538 +0000 UTC Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.942783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.942833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.942847 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.942866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.942880 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:17Z","lastTransitionTime":"2026-02-18T11:37:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.973042 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:17 crc kubenswrapper[4922]: E0218 11:37:17.973235 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.973347 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:17 crc kubenswrapper[4922]: E0218 11:37:17.973526 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.973637 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:17 crc kubenswrapper[4922]: E0218 11:37:17.973709 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:17 crc kubenswrapper[4922]: I0218 11:37:17.973761 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:17 crc kubenswrapper[4922]: E0218 11:37:17.973813 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.046200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.046248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.046264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.046288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.046304 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.149565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.149655 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.149716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.149748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.149771 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.252526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.252567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.252579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.252597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.252612 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.356260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.356332 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.356354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.356417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.356433 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.459086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.459132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.459142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.459159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.459174 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.562176 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.562243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.562258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.562275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.562286 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.664994 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.665036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.665049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.665064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.665076 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.767830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.767898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.767911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.767935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.767948 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.871080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.871127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.871140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.871155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.871166 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.934728 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 07:07:37.729716804 +0000 UTC Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.973271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.973322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.973332 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.973391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.973407 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:18Z","lastTransitionTime":"2026-02-18T11:37:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:18 crc kubenswrapper[4922]: I0218 11:37:18.985994 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf
256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.007267 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.024734 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.040602 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.053743 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.073013 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc 
kubenswrapper[4922]: I0218 11:37:19.075546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.075578 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.075589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.075607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.075617 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.094620 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.106781 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11
:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.118834 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host
\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.133035 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.157652 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port 
openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.180302 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.180357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.180414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.180448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.180471 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.180309 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.193513 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.210440 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18
T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.232729 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.246624 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.261319 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.284213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.284274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.284287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.284309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.284326 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.388324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.388419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.388432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.388450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.388461 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.491544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.491607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.491621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.491640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.491653 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.596075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.596132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.596146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.596167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.596185 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.699533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.699583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.699595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.699619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.699636 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.802664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.802711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.802721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.802737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.802745 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.906327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.906469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.906490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.906519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.906545 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:19Z","lastTransitionTime":"2026-02-18T11:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.935767 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:43:39.867443601 +0000 UTC Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.972503 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.972585 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.972616 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:19 crc kubenswrapper[4922]: E0218 11:37:19.972800 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:19 crc kubenswrapper[4922]: I0218 11:37:19.972846 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:19 crc kubenswrapper[4922]: E0218 11:37:19.972906 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:19 crc kubenswrapper[4922]: E0218 11:37:19.973028 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:19 crc kubenswrapper[4922]: E0218 11:37:19.973218 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.010586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.010657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.010677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.010720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.010760 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.114761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.114875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.114894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.114918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.114935 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.218453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.218529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.218568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.218604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.218629 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.322167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.322267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.322292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.322321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.322346 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.425249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.425299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.425310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.425329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.425347 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.528260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.528335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.528390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.528426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.528448 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.631918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.631992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.632011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.632035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.632052 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.735314 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.735417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.735439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.735465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.735483 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.841850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.841921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.841933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.841947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.842010 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.936761 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 01:14:05.63413404 +0000 UTC Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.944431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.944477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.944493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.944515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.944530 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:20Z","lastTransitionTime":"2026-02-18T11:37:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:20 crc kubenswrapper[4922]: I0218 11:37:20.973337 4922 scope.go:117] "RemoveContainer" containerID="4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.048165 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.048232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.048249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.048278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.048346 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.150660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.150748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.150798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.150824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.150873 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.253000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.253038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.253046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.253058 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.253067 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.343127 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.345311 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.345683 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.355582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.355636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.355652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.355677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.355695 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.364407 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.381109 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 
11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.403328 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1
bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.416242 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.431628 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.447946 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.458918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.459331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.459504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.459645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.459767 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.464504 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.477910 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.489590 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.498263 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc 
kubenswrapper[4922]: I0218 11:37:21.503667 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:21 crc kubenswrapper[4922]: E0218 11:37:21.503835 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:21 crc kubenswrapper[4922]: E0218 11:37:21.503943 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:29.50392515 +0000 UTC m=+51.231629230 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.562285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.562342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.562355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.562389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.562401 4922 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.564057 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d
738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.583163 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.595748 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.606219 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.618279 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.637052 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e80f07cd7247e51069bf2b1dbdb4c9276cf6ea8904cc3f193a79c72f996bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"message\\\":\\\"r *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.244406 6237 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:11.245284 6237 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:37:11.245412 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:37:11.245452 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0218 11:37:11.245461 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0218 11:37:11.245483 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0218 11:37:11.245493 6237 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:37:11.245499 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0218 11:37:11.245735 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:11.245804 6237 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:37:11.245838 6237 factory.go:656] Stopping watch factory\\\\nI0218 11:37:11.245855 6237 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:11.245876 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port 
openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.647913 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:21Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.664268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.664297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.664305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.664318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.664328 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.767051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.767115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.767134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.767158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.767174 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.870165 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.870221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.870232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.870251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.870266 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.937802 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 17:34:01.704519731 +0000 UTC Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.972296 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.972339 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:21 crc kubenswrapper[4922]: E0218 11:37:21.972446 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.972446 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.972486 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:21 crc kubenswrapper[4922]: E0218 11:37:21.972745 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:21 crc kubenswrapper[4922]: E0218 11:37:21.972904 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:21 crc kubenswrapper[4922]: E0218 11:37:21.973019 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.974196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.974265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.974289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.974323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:21 crc kubenswrapper[4922]: I0218 11:37:21.974345 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:21Z","lastTransitionTime":"2026-02-18T11:37:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.077926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.077981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.077991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.078012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.078023 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.180886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.180946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.180967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.180995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.181017 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.284741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.284790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.284799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.284814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.284823 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.387282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.387335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.387346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.387402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.387418 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.490114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.490144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.490155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.490172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.490185 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.593616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.593692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.593716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.593747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.593768 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.697290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.697426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.697442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.697467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.697485 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.799632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.799683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.799697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.799717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.799729 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.902754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.902809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.902818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.902833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.902842 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:22Z","lastTransitionTime":"2026-02-18T11:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:22 crc kubenswrapper[4922]: I0218 11:37:22.938498 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 18:46:56.867062781 +0000 UTC Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.004849 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.004894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.004905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.004920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.004932 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.107665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.107713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.107726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.107741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.107754 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.210620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.210872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.210943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.211011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.211077 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.313605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.313880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.313946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.314026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.314092 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.416612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.416867 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.416932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.416993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.417059 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.519023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.519076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.519088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.519102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.519112 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.622111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.622157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.622167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.622183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.622193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.725915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.726327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.726440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.726580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.726673 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.829620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.829697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.829714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.829740 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.829759 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.932903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.932958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.932968 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.932984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.932995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:23Z","lastTransitionTime":"2026-02-18T11:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.939543 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 07:14:06.916483861 +0000 UTC Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.973019 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.973090 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.973214 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:23 crc kubenswrapper[4922]: I0218 11:37:23.973392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:23 crc kubenswrapper[4922]: E0218 11:37:23.973392 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:23 crc kubenswrapper[4922]: E0218 11:37:23.973599 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:23 crc kubenswrapper[4922]: E0218 11:37:23.973780 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:23 crc kubenswrapper[4922]: E0218 11:37:23.973915 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.036280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.036327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.036335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.036351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.036385 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.139204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.139281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.139304 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.139328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.139347 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.242625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.242693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.242720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.242749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.242769 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.346698 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.346862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.346883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.346906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.346961 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.450052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.450101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.450110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.450129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.450141 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.553641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.553695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.553713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.553742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.553756 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.657479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.657580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.657603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.657632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.657652 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.760478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.760532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.760549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.760574 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.760591 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.813525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.813584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.813603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.813627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.813644 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: E0218 11:37:24.831626 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:24Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.836593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.836813 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.836941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.837101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.837230 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: E0218 11:37:24.859400 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:24Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.864593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.864636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.864646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.864662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.864672 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: E0218 11:37:24.882744 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:24Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.887034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.887149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.887176 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.887205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.887230 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: E0218 11:37:24.909020 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:24Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.914787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.914860 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.914877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.914902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.914929 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: E0218 11:37:24.935326 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:24Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:24 crc kubenswrapper[4922]: E0218 11:37:24.935620 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.937911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.938001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.938021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.938042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.938058 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:24Z","lastTransitionTime":"2026-02-18T11:37:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:24 crc kubenswrapper[4922]: I0218 11:37:24.940230 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 02:52:11.751949413 +0000 UTC Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.040672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.040715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.040727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.040744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.040756 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.143428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.143494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.143516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.143543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.143563 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.246284 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.246434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.246460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.246490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.246509 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.293159 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.294506 4922 scope.go:117] "RemoveContainer" containerID="a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.310545 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.336602 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.350033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.350092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.350113 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.350137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.350155 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.356884 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.374389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-a
piserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:
9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.388949 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.407283 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.427499 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.445849 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.452535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.452568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.452580 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.452597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.452609 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.460517 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9
e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.475929 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc 
kubenswrapper[4922]: I0218 11:37:25.502147 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.521715 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.537974 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.551245 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.556502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.556542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.556556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.556576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.556589 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.576133 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 
6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.599804 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.615173 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:25Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.659892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.659942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.659956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.659978 4922 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.659992 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.762309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.762383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.762400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.762420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.762436 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.865535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.865577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.865586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.865600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.865611 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.941330 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:02:18.993819763 +0000 UTC Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.968278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.968348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.968428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.968450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.968461 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:25Z","lastTransitionTime":"2026-02-18T11:37:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.972759 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.972830 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.972820 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:25 crc kubenswrapper[4922]: I0218 11:37:25.972761 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:25 crc kubenswrapper[4922]: E0218 11:37:25.972896 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:25 crc kubenswrapper[4922]: E0218 11:37:25.973010 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:25 crc kubenswrapper[4922]: E0218 11:37:25.973130 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:25 crc kubenswrapper[4922]: E0218 11:37:25.973229 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.070771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.070824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.070847 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.070875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.070894 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.156236 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.167263 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.173976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.174035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.174046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.174066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.174080 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.184164 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 
6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.209647 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.224134 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.241628 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.253970 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.263289 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.272183 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.275772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.275807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.275818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.275836 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.275847 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.288025 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.301886 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 
11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.314727 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1
bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.324860 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.335969 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.345636 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.355532 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc 
kubenswrapper[4922]: I0218 11:37:26.362639 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/2.log" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.363304 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/1.log" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.366057 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353" exitCode=1 Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.366133 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.366194 4922 scope.go:117] "RemoveContainer" containerID="a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.366994 4922 scope.go:117] "RemoveContainer" containerID="9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353" Feb 18 11:37:26 crc kubenswrapper[4922]: E0218 11:37:26.367172 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.369544 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.378040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.378065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.378073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.378087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.378096 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.383549 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.396080 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.407492 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.423638 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.436041 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 
11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.447886 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1
bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.457472 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.467089 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.476170 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.480014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.480035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.480067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.480081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.480088 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.488721 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc 
kubenswrapper[4922]: I0218 11:37:26.499718 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.510028 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.520746 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.531979 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.548136 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0218 11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374
b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.566834 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.577867 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.582020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.582051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.582062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.582076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.582086 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.588561 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.599348 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.609966 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:26Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.684673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.684719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.684736 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.684757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.684770 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.787571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.787649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.787681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.787711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.787736 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.891107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.891496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.891645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.891806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.891938 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.941753 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:17:53.987306455 +0000 UTC Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.995228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.995291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.995309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.995334 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:26 crc kubenswrapper[4922]: I0218 11:37:26.995350 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:26Z","lastTransitionTime":"2026-02-18T11:37:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.098130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.098215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.098233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.098734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.098793 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.202194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.202241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.202260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.202281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.202298 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.304819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.304858 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.304869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.304884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.304894 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.369273 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/2.log" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.407689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.407726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.407734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.407750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.407759 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.511474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.511540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.511602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.511627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.511641 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.614774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.614835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.614850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.614886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.614898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.717203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.717274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.717293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.717316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.717332 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.820495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.820537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.820547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.820562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.820574 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.923798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.923848 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.923859 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.923877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.923891 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:27Z","lastTransitionTime":"2026-02-18T11:37:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.942646 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 11:05:58.245849871 +0000 UTC Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.972343 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.972468 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.972355 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:27 crc kubenswrapper[4922]: E0218 11:37:27.972571 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:27 crc kubenswrapper[4922]: I0218 11:37:27.972654 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:27 crc kubenswrapper[4922]: E0218 11:37:27.972730 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:27 crc kubenswrapper[4922]: E0218 11:37:27.972851 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:27 crc kubenswrapper[4922]: E0218 11:37:27.972999 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.026722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.026793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.026815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.026851 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.026870 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.129598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.129674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.129699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.129773 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.129791 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.232326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.232399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.232415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.232432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.232444 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.336326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.336457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.336512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.336560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.336587 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.440355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.440518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.440542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.440572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.440594 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.542942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.543028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.543061 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.543094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.543117 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.646240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.646325 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.646426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.646456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.646476 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.749222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.749280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.749295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.749313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.749326 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.851834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.851886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.851899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.851917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.851930 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.943527 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 05:02:39.976614463 +0000 UTC Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.954834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.954889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.954902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.954919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.954931 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:28Z","lastTransitionTime":"2026-02-18T11:37:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:28 crc kubenswrapper[4922]: I0218 11:37:28.991529 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:28Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.005498 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.023389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.039501 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.055080 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.056952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.057022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.057037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.057059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.057098 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.087319 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 
6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node 
event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.098979 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.116120 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.133441 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 
11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.149134 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1
bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.160615 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.160674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.160685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.160703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.160715 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.163267 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.176768 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.194311 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc 
kubenswrapper[4922]: I0218 11:37:29.209929 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.226022 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.242765 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.259237 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.262702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.262737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.262748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.262766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.262779 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.274619 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.365758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.365824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.365835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.365859 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.365874 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.468544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.468607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.468624 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.468652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.468670 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.572637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.572694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.572716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.572748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.572772 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.593327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:29 crc kubenswrapper[4922]: E0218 11:37:29.593548 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:29 crc kubenswrapper[4922]: E0218 11:37:29.593674 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:37:45.593647664 +0000 UTC m=+67.321351774 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.675613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.675688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.675699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.675715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.675724 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.778571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.778638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.778662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.778696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.778724 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.883001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.883066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.883087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.883114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.883135 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.943737 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:53:01.649068939 +0000 UTC
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.973100 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:29 crc kubenswrapper[4922]: E0218 11:37:29.973355 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.973598 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.973683 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:37:29 crc kubenswrapper[4922]: E0218 11:37:29.973810 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.973627 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:37:29 crc kubenswrapper[4922]: E0218 11:37:29.973972 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:37:29 crc kubenswrapper[4922]: E0218 11:37:29.974166 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.986769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.986828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.986851 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.986880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:29 crc kubenswrapper[4922]: I0218 11:37:29.986903 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:29Z","lastTransitionTime":"2026-02-18T11:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.089776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.089851 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.089870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.089896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.089916 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.192959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.193050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.193075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.193103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.193123 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.296577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.296700 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.296720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.296748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.296765 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.400073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.400127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.400146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.400174 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.400214 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.503134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.503186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.503200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.503223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.503238 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.606518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.606623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.606641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.606663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.606680 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.709638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.709756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.709776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.709801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.709815 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.812466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.812536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.812549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.812569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.812585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.915549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.915618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.915638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.915662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.915680 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:30Z","lastTransitionTime":"2026-02-18T11:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:30 crc kubenswrapper[4922]: I0218 11:37:30.943956 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 14:17:36.510236309 +0000 UTC
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.018759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.018800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.018809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.018824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.018835 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.121957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.122028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.122048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.122075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.122096 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.225224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.225274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.225288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.225308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.225321 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.328218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.328266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.328277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.328294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.328306 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.431260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.431319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.431333 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.431353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.431387 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.534703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.534793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.534816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.534846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.534868 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.637844 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.637936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.637964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.637990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.638008 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.740526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.740570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.740585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.740605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.740621 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.818007 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.818254 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:38:03.818229974 +0000 UTC m=+85.545934064 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.843230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.843286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.843308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.843331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.843346 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.919419 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.919514 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.919581 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919613 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.919624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919649 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919666 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919710 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919732 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:38:03.919709967 +0000 UTC m=+85.647414057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919737 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919755 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919795 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:38:03.919781808 +0000 UTC m=+85.647485898 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919839 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919874 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919956 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:38:03.919928642 +0000 UTC m=+85.647632772 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.919995 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:38:03.919975453 +0000 UTC m=+85.647679583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.944117 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 12:15:39.199787056 +0000 UTC
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.946148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.946206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.946229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.946259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.946281 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:31Z","lastTransitionTime":"2026-02-18T11:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.972587 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.972712 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.972587 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.972771 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:37:31 crc kubenswrapper[4922]: I0218 11:37:31.972609 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.972922 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.973037 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:31 crc kubenswrapper[4922]: E0218 11:37:31.973176 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.049322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.049401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.049421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.049449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.049464 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.152583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.152641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.152656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.152678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.152694 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.256265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.256329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.256340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.256407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.256423 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.359468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.359523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.359535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.359555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.359569 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.463237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.463279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.463295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.463316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.463329 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.566142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.566231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.566250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.566272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.566286 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.668527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.668755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.668816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.668906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.668967 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.772016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.772084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.772095 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.772112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.772124 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.874199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.874232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.874242 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.874260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.874271 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.944508 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 22:12:38.291885881 +0000 UTC Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.977049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.977090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.977107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.977129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:32 crc kubenswrapper[4922]: I0218 11:37:32.977145 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:32Z","lastTransitionTime":"2026-02-18T11:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.079776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.079836 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.079853 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.079878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.079893 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.182208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.182534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.182712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.182849 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.182967 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.285166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.285200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.285213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.285225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.285235 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.392218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.392473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.392484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.392499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.392510 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.495524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.495637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.495666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.495702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.495732 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.600805 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.600871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.600884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.600903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.600917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.703983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.704042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.704056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.704077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.704092 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.806898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.806951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.806961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.806976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.806985 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.909004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.909072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.909101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.909129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.909147 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:33Z","lastTransitionTime":"2026-02-18T11:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.945606 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 18:53:40.938791194 +0000 UTC Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.972918 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.973000 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:33 crc kubenswrapper[4922]: E0218 11:37:33.973079 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:33 crc kubenswrapper[4922]: E0218 11:37:33.973157 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.973309 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:33 crc kubenswrapper[4922]: I0218 11:37:33.973797 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:33 crc kubenswrapper[4922]: E0218 11:37:33.973940 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:33 crc kubenswrapper[4922]: E0218 11:37:33.974060 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.011972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.012014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.012023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.012037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.012047 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.115044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.115091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.115103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.115119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.115131 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.218234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.218282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.218295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.218318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.218331 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.322126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.322187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.322205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.322231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.322249 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.425401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.425612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.425643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.425676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.425701 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.528769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.528840 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.528856 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.528880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.528898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.631772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.631844 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.631869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.631896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.631917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.734704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.734773 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.734786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.734817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.734835 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.838215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.838290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.838311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.838339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.838404 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.942139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.942202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.942214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.942237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.942251 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:34Z","lastTransitionTime":"2026-02-18T11:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:34 crc kubenswrapper[4922]: I0218 11:37:34.946245 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 21:44:15.432524908 +0000 UTC Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.041268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.041339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.041405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.041440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.041462 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.065332 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:35Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.071667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.071744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.071758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.071783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.071797 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.093413 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:35Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.103142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.103223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.103242 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.103269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.103288 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.122648 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:35Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.128220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.128504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.128523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.128548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.128567 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.145808 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:35Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.151251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.151313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.151335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.151393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.151416 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.178788 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:35Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.178931 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.182002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.182064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.182079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.182100 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.182117 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.284792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.284829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.284837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.284852 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.284861 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.387865 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.387957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.388026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.388062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.388085 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.491173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.491214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.491226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.491243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.491255 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.594401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.594474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.594488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.594512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.594526 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.698019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.698101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.698128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.698157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.698176 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.801909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.801985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.802011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.802039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.802059 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.904704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.904748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.904762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.904779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.904791 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:35Z","lastTransitionTime":"2026-02-18T11:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.946764 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 20:17:54.475449117 +0000 UTC Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.972440 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.972509 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.972580 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:35 crc kubenswrapper[4922]: I0218 11:37:35.972459 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.972582 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.972703 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.972877 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:35 crc kubenswrapper[4922]: E0218 11:37:35.973223 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.009086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.009143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.009152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.009168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.009180 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.112309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.112400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.112424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.112446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.112461 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.215776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.215841 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.215866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.215894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.215914 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.319711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.319821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.319846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.319871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.319892 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.423549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.423612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.423623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.423643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.423701 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.525994 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.526068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.526105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.526138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.526163 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.629492 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.629542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.629561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.629585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.629603 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.733332 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.733436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.733461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.733491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.733512 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.836205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.836266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.836285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.836308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.836327 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.939733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.940013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.940090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.940190 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.940250 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:36Z","lastTransitionTime":"2026-02-18T11:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.947337 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 04:43:00.85248585 +0000 UTC Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.966451 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.973047 4922 scope.go:117] "RemoveContainer" containerID="9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353" Feb 18 11:37:36 crc kubenswrapper[4922]: E0218 11:37:36.973213 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.982302 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:36Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:36 crc kubenswrapper[4922]: I0218 11:37:36.996309 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T1
1:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:36Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.007184 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.019716 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.032515 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.042710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.042738 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.042747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.042760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.042769 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.045137 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.058856 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.072650 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc 
kubenswrapper[4922]: I0218 11:37:37.088966 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.103345 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.116339 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.136988 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.146841 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.146877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.146908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.146924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.146934 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.161180 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a72b7b3feb4b1eda6d0a688e2f4f1dbb05dbf932f4e786b6e414f147964ce746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"message\\\":\\\"rking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0218 11:37:14.424128 6409 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424192 6409 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424197 
6409 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0218 11:37:14.424238 6409 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0218 11:37:14.424249 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0218 11:37:14.424254 6409 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0218 11:37:14.424121 6409 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0218 11:37:14.423917 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node 
event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.189850 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.211206 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.224727 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.242941 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.249914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.249972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.249987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.250009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.250023 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.260869 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.273927 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.295296 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.352644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.352683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.352695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.352712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.352729 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.352859 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.375110 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.395379 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.410965 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.423245 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.437734 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\"
,\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d47
72bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.452830 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.455774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.455824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.455836 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.455855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.455867 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.475526 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.495560 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997
722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.511929 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.526302 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.542430 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.556610 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc 
kubenswrapper[4922]: I0218 11:37:37.558456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.558498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.558507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.558536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.558545 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.570253 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf
256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.581268 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.594944 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:37Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.662233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.662324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.662348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 
11:37:37.662410 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.662435 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.765090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.765138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.765149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.765165 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.765175 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.868346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.868491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.868517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.868548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.868570 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.947879 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 10:16:39.122661884 +0000 UTC Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.971807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.971878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.971899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.971922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.971942 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:37Z","lastTransitionTime":"2026-02-18T11:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.972569 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.972764 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:37 crc kubenswrapper[4922]: E0218 11:37:37.972907 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.972951 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:37 crc kubenswrapper[4922]: E0218 11:37:37.973279 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:37 crc kubenswrapper[4922]: E0218 11:37:37.973488 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:37 crc kubenswrapper[4922]: I0218 11:37:37.973623 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:37 crc kubenswrapper[4922]: E0218 11:37:37.973807 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.075598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.075686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.075713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.075745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.075769 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.178165 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.178622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.178833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.179024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.179209 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.282468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.282544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.282577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.282612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.282637 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.385285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.385514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.385544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.385575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.385600 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.489766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.489837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.489861 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.489894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.489917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.593072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.593132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.593152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.593175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.593191 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.696004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.696088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.696110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.696145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.696166 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.799049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.799122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.799140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.799167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.799186 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.902202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.902297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.902316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.902340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.902387 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:38Z","lastTransitionTime":"2026-02-18T11:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:38 crc kubenswrapper[4922]: I0218 11:37:38.948109 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:49:59.265596094 +0000 UTC Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.004963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.005569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.005585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.005605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.005618 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.006736 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.028772 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.049516 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.067809 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.088793 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.110279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.110759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.110967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.111151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.111286 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.127509 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.145568 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.169532 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T1
1:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.190012 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.215709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.215821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.215840 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.215867 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.215884 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.216519 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.234302 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997
722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.248474 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.265334 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.286464 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.300741 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.318395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.318692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.318826 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.319045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.319198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.320940 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.336931 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.353141 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:39 crc 
kubenswrapper[4922]: I0218 11:37:39.421801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.421853 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.421871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.421898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.421916 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.524411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.524754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.524938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.525119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.525290 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.628173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.628232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.628258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.628290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.628315 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.731924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.731971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.731982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.732001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.732013 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.834586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.834644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.834660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.834682 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.834701 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.937160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.937210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.937223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.937243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.937255 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:39Z","lastTransitionTime":"2026-02-18T11:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.948536 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 15:38:43.179070188 +0000 UTC Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.972256 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.972277 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:39 crc kubenswrapper[4922]: E0218 11:37:39.972392 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.972438 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:39 crc kubenswrapper[4922]: I0218 11:37:39.972458 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:39 crc kubenswrapper[4922]: E0218 11:37:39.972633 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:39 crc kubenswrapper[4922]: E0218 11:37:39.972699 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:39 crc kubenswrapper[4922]: E0218 11:37:39.972785 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.040076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.040136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.040153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.040178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.040196 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.144115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.144172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.144188 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.144216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.144236 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.246964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.247024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.247045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.247081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.247105 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.349879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.349934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.349946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.349967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.349983 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.452854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.452928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.452952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.452986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.453009 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.557265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.557349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.557400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.557433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.557459 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.660629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.660690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.660707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.660730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.660747 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.763076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.763184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.763205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.763233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.763252 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.866594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.866632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.866654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.866672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.866705 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.949733 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:53:52.234772539 +0000 UTC Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.970217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.970259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.970272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.970290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:40 crc kubenswrapper[4922]: I0218 11:37:40.970303 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:40Z","lastTransitionTime":"2026-02-18T11:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.072790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.072849 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.072867 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.072891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.072909 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.177146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.177211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.177233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.177261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.177284 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.279978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.280103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.280119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.280141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.280157 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.382381 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.382437 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.382448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.382467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.382479 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.484880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.484934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.484951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.484974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.484991 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.587299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.587329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.587337 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.587348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.587385 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.690679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.690728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.690744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.690764 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.690781 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.792931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.792960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.792969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.792984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.792994 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.895934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.895969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.895979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.895995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.896019 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.950166 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:52:31.021065778 +0000 UTC Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.972800 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:41 crc kubenswrapper[4922]: E0218 11:37:41.972945 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.973185 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:41 crc kubenswrapper[4922]: E0218 11:37:41.973264 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.973480 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:41 crc kubenswrapper[4922]: E0218 11:37:41.973574 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.973931 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:41 crc kubenswrapper[4922]: E0218 11:37:41.974092 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.998451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.998491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.998500 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.998514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:41 crc kubenswrapper[4922]: I0218 11:37:41.998523 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:41Z","lastTransitionTime":"2026-02-18T11:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.101547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.101589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.101606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.101628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.101647 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.205160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.205216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.205237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.205264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.205281 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.309416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.309496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.309512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.309538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.309561 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.412629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.412678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.412694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.412716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.412732 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.515745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.515791 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.515802 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.515819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.515831 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.618429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.618481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.618490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.618504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.618512 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.721729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.721824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.721848 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.721878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.721898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.826151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.826232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.826250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.826280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.826302 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.929477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.929585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.929606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.929642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.929668 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:42Z","lastTransitionTime":"2026-02-18T11:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:42 crc kubenswrapper[4922]: I0218 11:37:42.951003 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 03:50:39.83676961 +0000 UTC Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.031922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.031969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.031980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.031995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.032006 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.133950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.135103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.135181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.135197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.135216 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.237770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.237816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.237827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.237842 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.237854 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.340021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.340056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.340067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.340083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.340094 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.441858 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.441892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.441903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.441918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.441928 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.544517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.544550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.544558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.544571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.544580 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.647668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.647710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.647726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.647740 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.647750 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.750556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.750610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.750626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.750652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.750669 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.853379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.853430 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.853442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.853459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.853471 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.951602 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 17:09:47.664911598 +0000 UTC Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.956319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.956425 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.956453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.956485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.956507 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:43Z","lastTransitionTime":"2026-02-18T11:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.972737 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:43 crc kubenswrapper[4922]: E0218 11:37:43.972877 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.973111 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:43 crc kubenswrapper[4922]: E0218 11:37:43.973173 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.973827 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:43 crc kubenswrapper[4922]: I0218 11:37:43.974594 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:43 crc kubenswrapper[4922]: E0218 11:37:43.975273 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:43 crc kubenswrapper[4922]: E0218 11:37:43.974663 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.060029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.060095 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.060117 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.060142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.060159 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.163208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.163253 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.163265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.163281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.163295 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.265878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.265920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.265933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.265949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.265966 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.369036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.369093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.369110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.369132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.369150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.471485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.471528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.471539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.471557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.471569 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.574136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.574171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.574182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.574197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.574209 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.676648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.676682 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.676693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.676709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.676722 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.779700 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.779765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.779776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.779799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.779812 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.882438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.882493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.882506 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.882524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.882535 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.952688 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:07:01.625055123 +0000 UTC Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.984926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.984970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.984980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.984993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:44 crc kubenswrapper[4922]: I0218 11:37:44.985004 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:44Z","lastTransitionTime":"2026-02-18T11:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.087194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.087256 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.087267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.087290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.087303 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.190051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.190125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.190147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.190178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.190196 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.293071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.293107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.293118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.293134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.293145 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.356073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.356148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.356163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.356182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.356193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.372855 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:45Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.376946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.376973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.376983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.376995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.377004 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.389866 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:45Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.397583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.397621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.397657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.397678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.397690 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.413537 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:45Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.418262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.418329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.418341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.418373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.418385 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.441863 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:45Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.445888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.445932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.445947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.445970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.445987 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.462487 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:45Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.462661 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.464285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.464329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.464343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.464385 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.464404 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.567756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.568297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.568495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.568647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.568797 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.671855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.671917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.671927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.671947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.671961 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.675666 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.675983 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.676118 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:38:17.676086113 +0000 UTC m=+99.403790233 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.774496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.774550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.774560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.774579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.774592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.876600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.876644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.876662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.876677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.876688 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.953719 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 13:55:41.827592233 +0000 UTC Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.973018 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.973051 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.973116 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.973036 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.973158 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.973347 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.973450 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:45 crc kubenswrapper[4922]: E0218 11:37:45.973546 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.982742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.982806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.982815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.982831 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:45 crc kubenswrapper[4922]: I0218 11:37:45.982841 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:45Z","lastTransitionTime":"2026-02-18T11:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.085279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.085343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.085377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.085397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.085412 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.188053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.188103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.188115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.188134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.188149 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.291514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.291577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.291590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.291612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.291624 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.394411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.394516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.394545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.394636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.394663 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.497318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.497395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.497410 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.497429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.497444 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.599759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.599850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.599870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.599903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.599921 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.703697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.703737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.703746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.703760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.703771 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.806863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.806932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.806946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.806970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.806983 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.937384 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.937444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.937457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.937474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.937487 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:46Z","lastTransitionTime":"2026-02-18T11:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:46 crc kubenswrapper[4922]: I0218 11:37:46.954785 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 04:42:56.206733582 +0000 UTC Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.040662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.040735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.040760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.040795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.040821 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.143992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.144031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.144042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.144057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.144071 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.246528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.246571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.246581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.246596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.246606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.349568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.349607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.349617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.349634 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.349646 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.451675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.451739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.451757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.451782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.451800 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.554257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.554303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.554312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.554330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.554339 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.657773 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.657846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.657863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.657888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.657908 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.761107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.761154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.761168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.761189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.761201 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.865943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.866001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.866015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.866041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.866067 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.954981 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 01:38:52.949408249 +0000 UTC Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.968791 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.968844 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.968857 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.968876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.968889 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:47Z","lastTransitionTime":"2026-02-18T11:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.972524 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.972565 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.972639 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:47 crc kubenswrapper[4922]: I0218 11:37:47.972551 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:47 crc kubenswrapper[4922]: E0218 11:37:47.972692 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:47 crc kubenswrapper[4922]: E0218 11:37:47.972811 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:47 crc kubenswrapper[4922]: E0218 11:37:47.972903 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:47 crc kubenswrapper[4922]: E0218 11:37:47.972954 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.071456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.071516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.071533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.071560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.071577 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.174166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.174213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.174222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.174238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.174251 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.277469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.277528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.277540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.277557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.277570 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.379727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.379786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.379803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.379827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.379849 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.453414 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/0.log"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.453463 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b4595ac-c521-4ada-950d-e1b01cdff99b" containerID="83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df" exitCode=1
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.453501 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerDied","Data":"83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df"}
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.454014 4922 scope.go:117] "RemoveContainer" containerID="83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df"
Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.463571 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.477090 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T1
1:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.481433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.481471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.481516 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.481531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.481540 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.491561 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.509902 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.523826 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.537936 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.548880 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.560921 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc 
kubenswrapper[4922]: I0218 11:37:48.575632 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.584235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.584283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.584294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.584311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.584322 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.591855 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.610629 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.622331 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.647631 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.665147 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.679218 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.686502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.686528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.686536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.686550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.686561 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.691740 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.702704 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mou
ntPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.713335 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.789524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.789565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.789575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.789592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.789602 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.892742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.892821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.892846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.892890 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.892913 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.955925 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 20:59:43.130843761 +0000 UTC Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.985697 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.995689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.995726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.995738 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.995755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.995767 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:48Z","lastTransitionTime":"2026-02-18T11:37:48Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:48 crc kubenswrapper[4922]: I0218 11:37:48.999913 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc 
kubenswrapper[4922]: I0218 11:37:49.013010 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da
0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 
11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.025193 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.043724 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.066616 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.081727 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.095087 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.098423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.098476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.098495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.098520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.098544 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.109383 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.123158 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc 
kubenswrapper[4922]: I0218 11:37:49.137024 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.148148 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.158851 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.170917 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.199176 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.200447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.200477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.200489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.200511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.200525 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.224353 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.242211 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.258767 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.303269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.303294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.303302 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.303315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.303324 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.404961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.404995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.405003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.405018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.405027 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.458134 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/0.log" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.458194 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerStarted","Data":"71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.473261 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.488418 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.502878 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.506879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.506911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.506923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.506940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.506952 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.512470 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.521412 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.531188 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc 
kubenswrapper[4922]: I0218 11:37:49.542232 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.552283 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.562954 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.573592 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.583622 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.600016 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.609986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.610016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.610026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.610039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.610048 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.611669 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.621718 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f
784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.630700 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.642438 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.658652 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.668508 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:49Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.713304 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.713379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.713389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.713432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.713442 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.816800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.816843 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.816851 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.816866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.816876 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.919274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.919321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.919331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.919348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.919375 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:49Z","lastTransitionTime":"2026-02-18T11:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.957172 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 12:46:03.022797714 +0000 UTC Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.972444 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.972507 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.972539 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:49 crc kubenswrapper[4922]: I0218 11:37:49.972543 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:49 crc kubenswrapper[4922]: E0218 11:37:49.972810 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:49 crc kubenswrapper[4922]: E0218 11:37:49.972919 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:49 crc kubenswrapper[4922]: E0218 11:37:49.973037 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:49 crc kubenswrapper[4922]: E0218 11:37:49.973143 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.021491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.021530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.021539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.021553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.021563 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.123761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.123801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.123810 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.123824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.123835 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.226071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.226121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.226133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.226150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.226162 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.367001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.367035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.367045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.367062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.367074 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.471850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.471884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.471895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.471909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.471917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.575771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.575806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.575815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.575836 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.575845 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.678306 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.678388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.678403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.678420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.678432 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.781510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.781585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.781610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.781641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.781662 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.884677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.884743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.884754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.884772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.884782 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.958189 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 10:54:26.043990047 +0000 UTC Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.974161 4922 scope.go:117] "RemoveContainer" containerID="9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.987212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.987243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.987252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.987266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:50 crc kubenswrapper[4922]: I0218 11:37:50.987275 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:50Z","lastTransitionTime":"2026-02-18T11:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.090287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.090320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.090335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.090351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.090378 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.192823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.192891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.192900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.192915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.192925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.295170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.295203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.295213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.295231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.295242 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.397527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.397571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.397581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.397599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.397609 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.464819 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/2.log" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.467326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.467858 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.484146 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 
11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.498964 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 
11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.507686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.507748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.507763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.507789 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.507806 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.522101 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.535343 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.549280 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.561832 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.573683 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc 
kubenswrapper[4922]: I0218 11:37:51.590198 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.607547 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.610110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.610142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.610151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.610166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.610179 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.623028 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.636036 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.658490 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.685407 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.700928 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.712825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.712878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.712889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.712910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.712923 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.718411 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.730885 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.747820 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.760289 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:51Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.815283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.815336 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.815351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.815390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.815402 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.918623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.918683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.918704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.918730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.918748 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:51Z","lastTransitionTime":"2026-02-18T11:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.959190 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 03:13:41.319725541 +0000 UTC Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.972581 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.972649 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.972690 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:51 crc kubenswrapper[4922]: I0218 11:37:51.972599 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:51 crc kubenswrapper[4922]: E0218 11:37:51.972755 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:51 crc kubenswrapper[4922]: E0218 11:37:51.972853 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:51 crc kubenswrapper[4922]: E0218 11:37:51.972991 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:51 crc kubenswrapper[4922]: E0218 11:37:51.973200 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.021838 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.021881 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.021892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.021908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.021919 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.124199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.124254 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.124275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.124302 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.124320 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.227299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.227380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.227393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.227415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.227430 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.330278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.330392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.330419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.330454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.330483 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.436493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.436543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.436567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.436590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.436605 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.475221 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/3.log" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.476578 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/2.log" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.479918 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" exitCode=1 Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.479989 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.480045 4922 scope.go:117] "RemoveContainer" containerID="9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.480926 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:37:52 crc kubenswrapper[4922]: E0218 11:37:52.481161 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.500325 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.520995 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.541965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.542023 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.542036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.542060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.542073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.545524 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.568518 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.585814 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.603140 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc 
kubenswrapper[4922]: I0218 11:37:52.624014 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.637690 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.645704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.645734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.645742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.645757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.645770 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.657145 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.674105 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.688727 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.714643 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.750190 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.751970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.752012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.752025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.752044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 
11:37:52.752057 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.769085 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.788518 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f
69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.801996 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.816556 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.839756 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c0bf0b715f9f63ac45a7294c1ecd769e8b753eba698432388e02092cd5fb353\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:37:26.126150 6616 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:37:26.126155 6616 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126204 6616 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 
11:37:26.126216 6616 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:37:26.126228 6616 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:37:26.126241 6616 handler.go:208] Removed *v1.Node event handler 7\\\\nI0218 11:37:26.126248 6616 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:37:26.126253 6616 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:37:26.126261 6616 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:37:26.126270 6616 factory.go:656] Stopping watch factory\\\\nI0218 11:37:26.126287 6616 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:37:26.126604 6616 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0218 11:37:26.126749 6616 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0218 11:37:26.126791 6616 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:26.126827 6616 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:26.126933 6616 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:51Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI0218 11:37:51.742382 6997 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.742628 6997 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.742905 6997 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.743039 6997 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.743155 6997 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.744128 6997 factory.go:656] Stopping watch factory\\\\nI0218 11:37:51.758771 6997 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0218 11:37:51.758806 6997 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0218 11:37:51.758875 6997 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:51.758908 6997 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:51.759034 6997 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/
cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:52Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.855201 4922 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.855258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.855276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.855296 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.855312 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.958012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.958067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.958082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.958104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.958118 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:52Z","lastTransitionTime":"2026-02-18T11:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:52 crc kubenswrapper[4922]: I0218 11:37:52.960301 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 11:29:15.226446896 +0000 UTC Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.062795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.062860 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.062870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.062891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.062905 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.166121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.166175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.166185 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.166204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.166216 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.269075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.269154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.269179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.269212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.269236 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.373394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.373493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.373515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.373567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.373597 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.476869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.476927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.476939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.476959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.476973 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.486390 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/3.log" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.490413 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:37:53 crc kubenswrapper[4922]: E0218 11:37:53.491947 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.510526 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df9
20515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.545888 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:51Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI0218 11:37:51.742382 6997 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.742628 6997 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.742905 6997 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.743039 6997 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.743155 6997 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.744128 6997 factory.go:656] Stopping watch factory\\\\nI0218 11:37:51.758771 6997 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0218 11:37:51.758806 6997 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0218 11:37:51.758875 6997 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:51.758908 6997 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:51.759034 6997 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.575200 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.580098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.580236 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.580246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.580265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.580275 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.595040 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.611967 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f
784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.625012 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.638756 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.658563 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\"
,\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d47
72bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.669692 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.683392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.683440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.683451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.683468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.683480 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.687729 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.701978 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997
722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.719050 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.734202 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.749051 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.761471 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc 
kubenswrapper[4922]: I0218 11:37:53.780747 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.796411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.796588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.796677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.796710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.796745 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.799565 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.813715 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.899132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.899163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.899173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.899189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.899199 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:53Z","lastTransitionTime":"2026-02-18T11:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.961291 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 04:53:53.286042603 +0000 UTC Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.972345 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:53 crc kubenswrapper[4922]: E0218 11:37:53.972519 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.972744 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:53 crc kubenswrapper[4922]: E0218 11:37:53.972801 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.972929 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:53 crc kubenswrapper[4922]: E0218 11:37:53.973003 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:53 crc kubenswrapper[4922]: I0218 11:37:53.973216 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:53 crc kubenswrapper[4922]: E0218 11:37:53.973296 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.002461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.002517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.002528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.002551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.002564 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.104802 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.104841 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.104851 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.104870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.104881 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.207173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.207210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.207218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.207235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.207246 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.311091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.311137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.311147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.311164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.311177 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.416813 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.416873 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.416883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.416906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.416926 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.520863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.520927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.520951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.520985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.521011 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.624348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.624473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.624498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.624530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.624549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.727085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.727160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.727178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.727202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.727223 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.830950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.831032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.831051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.831088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.831124 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.933918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.933969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.933981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.934001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.934017 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:54Z","lastTransitionTime":"2026-02-18T11:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:54 crc kubenswrapper[4922]: I0218 11:37:54.961818 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 04:44:28.962686355 +0000 UTC Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.036905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.036962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.036978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.037000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.037019 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.140426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.140483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.140495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.140518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.140535 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.243559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.243625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.243641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.243664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.243679 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.347231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.347275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.347287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.347306 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.347319 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.450730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.450793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.450814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.450834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.450846 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.520200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.520289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.520318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.520350 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.520436 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.545086 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:55Z is after 2025-08-24T17:21:41Z"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.551269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.551321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.551339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.551397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.551415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.570157 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:55Z is after 2025-08-24T17:21:41Z"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.577009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.577088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.577101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.577118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.577129 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.598729 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:55Z is after 2025-08-24T17:21:41Z"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.604590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.604733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.604763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.604816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.604854 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.625980 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.630958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.631002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.631015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.631032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.631045 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.657073 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.657323 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.660255 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.660301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.660316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.660340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.660377 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.763645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.763709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.763720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.763741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.763755 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.867272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.867398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.867423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.867454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.867478 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.962971 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:15:00.593576208 +0000 UTC
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.970917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.971084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.971189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.971291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.971390 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:55Z","lastTransitionTime":"2026-02-18T11:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.972270 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.972519 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.972781 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.972878 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.973131 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.973258 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674"
Feb 18 11:37:55 crc kubenswrapper[4922]: I0218 11:37:55.973444 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:37:55 crc kubenswrapper[4922]: E0218 11:37:55.973585 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.073909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.073995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.074020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.074054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.074076 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.177077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.177469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.177629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.177775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.177919 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.281944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.282493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.282793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.283036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.283213 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.386103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.386420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.386513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.386600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.386679 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.489771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.489827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.489847 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.489875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.489898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.593469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.593535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.593552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.593575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.593589 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.697156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.697230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.697294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.697316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.697326 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.800790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.800873 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.800884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.800904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.800919 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.904477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.904551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.904570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.904599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.904620 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:56Z","lastTransitionTime":"2026-02-18T11:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:56 crc kubenswrapper[4922]: I0218 11:37:56.964045 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:16:35.377900249 +0000 UTC Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.008389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.008470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.008485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.008513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.008535 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.111829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.111920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.111970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.112012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.112040 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.216025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.216127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.216138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.216158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.216174 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.319481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.319533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.319544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.319561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.319587 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.423544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.423631 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.423658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.423693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.423717 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.525929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.525993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.526007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.526038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.526053 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.628870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.628917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.628929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.628946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.628960 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.733930 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.734013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.734035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.734066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.734090 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.837747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.837830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.837854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.837886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.837912 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.941817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.941890 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.941915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.941941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.941956 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:57Z","lastTransitionTime":"2026-02-18T11:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.964486 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:13:20.457629417 +0000 UTC Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.973091 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.973155 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.973192 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:57 crc kubenswrapper[4922]: I0218 11:37:57.973155 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:57 crc kubenswrapper[4922]: E0218 11:37:57.973399 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:57 crc kubenswrapper[4922]: E0218 11:37:57.973599 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:57 crc kubenswrapper[4922]: E0218 11:37:57.973951 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:37:57 crc kubenswrapper[4922]: E0218 11:37:57.974134 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.045182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.045224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.045234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.045251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.045263 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.149158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.149229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.149248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.149279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.149330 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.252615 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.252703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.252724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.252746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.252758 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.356012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.356343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.356487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.356509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.356521 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.460547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.460659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.460707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.460744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.460768 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.565170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.565249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.565264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.565287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.565302 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.668539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.668609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.668627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.668655 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.668675 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.773074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.773136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.773145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.773165 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.773176 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.875870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.875920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.875933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.875949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.875962 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.965703 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 14:30:00.825208479 +0000 UTC Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.977964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.978012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.978023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.978040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.978054 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:58Z","lastTransitionTime":"2026-02-18T11:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:58 crc kubenswrapper[4922]: I0218 11:37:58.991218 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.012083 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T
11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a
ef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.027907 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.048122 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.066092 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.079964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.079996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.080006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.080021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.080032 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.085270 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.103157 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.117232 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.130218 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc 
kubenswrapper[4922]: I0218 11:37:59.141585 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.153453 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.167044 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.179914 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.182986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 
11:37:59.183026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.183041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.183061 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.183073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.201637 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:51Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI0218 11:37:51.742382 6997 reflector.go:311] Stopping 
reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.742628 6997 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.742905 6997 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.743039 6997 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.743155 6997 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.744128 6997 factory.go:656] Stopping watch factory\\\\nI0218 11:37:51.758771 6997 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0218 11:37:51.758806 6997 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0218 11:37:51.758875 6997 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:51.758908 6997 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:51.759034 6997 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.225440 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.236828 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.247879 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] 
Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.256244 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb504
5841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:37:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.286228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.286264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.286276 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.286289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.286297 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.388562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.388873 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.388893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.388917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.388934 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.491267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.491326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.491342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.491383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.491401 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.593901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.593937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.593945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.593958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.593967 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.696508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.696554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.696563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.696581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.696592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.800670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.800763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.800790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.800825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.800849 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.904442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.904536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.904565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.904597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.904622 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:37:59Z","lastTransitionTime":"2026-02-18T11:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.966853 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 19:49:47.569054548 +0000 UTC Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.972226 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:37:59 crc kubenswrapper[4922]: E0218 11:37:59.972398 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.972454 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:37:59 crc kubenswrapper[4922]: E0218 11:37:59.972512 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.972515 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:37:59 crc kubenswrapper[4922]: I0218 11:37:59.972241 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:37:59 crc kubenswrapper[4922]: E0218 11:37:59.972727 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:37:59 crc kubenswrapper[4922]: E0218 11:37:59.972867 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.007179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.007237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.007250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.007269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.007281 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.110792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.110862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.110877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.110897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.110911 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.214139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.214183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.214192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.214206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.214218 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.317481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.317528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.317543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.317564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.317580 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.421136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.421227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.421252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.421283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.421306 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.523768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.523837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.523854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.523880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.523898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.626498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.626559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.626595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.626622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.626642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.729869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.729938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.729966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.729995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.730015 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.833745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.833794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.833811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.833835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.833852 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.936939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.936998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.937017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.937040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.937060 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:00Z","lastTransitionTime":"2026-02-18T11:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:00 crc kubenswrapper[4922]: I0218 11:38:00.967220 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 12:22:09.320796657 +0000 UTC Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.040237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.040352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.040404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.040434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.040452 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.143293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.143337 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.143349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.143380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.143393 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.246228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.246308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.246342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.246403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.246426 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.349964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.350030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.350048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.350073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.350091 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.453096 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.453155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.453167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.453186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.453198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.556052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.556107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.556118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.556137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.556150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.659891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.659963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.659987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.660017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.660043 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.763307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.763402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.763421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.763443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.763460 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.866415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.866467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.866486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.866509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.866529 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.967313 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 02:30:09.330633274 +0000 UTC Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.969413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.969452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.969468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.969490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.969506 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:01Z","lastTransitionTime":"2026-02-18T11:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.972705 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:01 crc kubenswrapper[4922]: E0218 11:38:01.972905 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674"
Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.973160 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:38:01 crc kubenswrapper[4922]: E0218 11:38:01.973264 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.973516 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:38:01 crc kubenswrapper[4922]: E0218 11:38:01.973622 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:38:01 crc kubenswrapper[4922]: I0218 11:38:01.973820 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:38:01 crc kubenswrapper[4922]: E0218 11:38:01.973914 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.072079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.072128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.072140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.072159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.072172 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.174909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.175019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.175046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.175074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.175095 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.278019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.278083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.278106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.278141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.278164 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.381335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.381439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.381466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.381497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.381515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.484985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.485050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.485075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.485104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.485127 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.588629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.588778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.588810 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.588839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.588859 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.691875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.691941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.691967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.691995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.692017 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.795267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.795520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.795560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.795591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.795612 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.898490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.898550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.898569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.898595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.898612 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:02Z","lastTransitionTime":"2026-02-18T11:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:02 crc kubenswrapper[4922]: I0218 11:38:02.968345 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 21:23:41.430539238 +0000 UTC
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.001198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.001299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.001317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.001341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.001357 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.104818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.104907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.104916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.104931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.104939 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.208537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.208645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.208666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.208690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.208707 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.311434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.311520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.311542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.311571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.311596 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.415507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.415569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.415590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.415618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.415639 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.518073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.518118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.518130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.518147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.518159 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.620156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.620200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.620216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.620233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.620245 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.722688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.722765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.722796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.722825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.722847 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.825848 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.825907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.825936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.825960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.825977 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.884900 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.885290 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:39:07.885247782 +0000 UTC m=+149.612951892 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.929593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.929671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.929692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.929719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.929741 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:03Z","lastTransitionTime":"2026-02-18T11:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.968555 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 22:55:29.993476677 +0000 UTC
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.972148 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.972402 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.972761 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.972915 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.973260 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.973425 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.973633 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.973728 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.986605 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.986658 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.986714 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:38:03 crc kubenswrapper[4922]: I0218 11:38:03.986753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986867 4922 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986879 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986915 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986935 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986971 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986984 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.987031 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.987051 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.986938 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.986918817 +0000 UTC m=+149.714622927 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.987092 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.987071501 +0000 UTC m=+149.714775601 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.987110 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.987099872 +0000 UTC m=+149.714803962 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:38:03 crc kubenswrapper[4922]: E0218 11:38:03.987125 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.987117872 +0000 UTC m=+149.714821972 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.032887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.032927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.032945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.032962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.032972 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.137003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.137096 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.137114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.137183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.137204 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.241272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.241395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.241426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.241453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.241471 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.345429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.345496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.345514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.345544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.345563 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.454768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.454906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.454928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.454957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.454977 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.558835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.558887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.558905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.558931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.558955 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.662530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.662790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.662803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.662821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.662833 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.766179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.766274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.766295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.766320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.766338 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.869887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.869990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.870050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.870077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.870133 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.969640 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 16:32:35.473729346 +0000 UTC Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.974237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.974297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.974324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.974399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:04 crc kubenswrapper[4922]: I0218 11:38:04.974426 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:04Z","lastTransitionTime":"2026-02-18T11:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.078809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.078883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.078900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.078925 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.078942 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.182709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.182776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.182798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.182825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.182845 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.285488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.285537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.285552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.285573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.285587 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.389712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.389783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.389942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.389974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.389992 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.493468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.493553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.493571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.493592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.493608 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.597044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.597135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.597158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.597181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.597199 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.700209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.700267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.700278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.700298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.700312 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.803220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.803265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.803278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.803293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.803304 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.906664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.906719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.906733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.906752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.906764 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:05Z","lastTransitionTime":"2026-02-18T11:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.970393 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 15:47:01.474552542 +0000 UTC Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.972795 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.972828 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.972799 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:05 crc kubenswrapper[4922]: I0218 11:38:05.972863 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:05 crc kubenswrapper[4922]: E0218 11:38:05.972994 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:05 crc kubenswrapper[4922]: E0218 11:38:05.973105 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:05 crc kubenswrapper[4922]: E0218 11:38:05.973213 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:05 crc kubenswrapper[4922]: E0218 11:38:05.973265 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.009577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.009671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.009690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.009715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.009735 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.022052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.022102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.022119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.022143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.022166 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: E0218 11:38:06.036676 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.042220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.042263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.042279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.042298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.042314 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: E0218 11:38:06.054052 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.058883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.058921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.058934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.058950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.058969 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: E0218 11:38:06.077613 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.082994 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.083031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.083043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.083060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.083072 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: E0218 11:38:06.101702 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.106992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.107268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.107292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.107312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.107335 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: E0218 11:38:06.128713 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:06 crc kubenswrapper[4922]: E0218 11:38:06.128953 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.131531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.131605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.131628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.131656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.131674 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.235790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.235862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.235879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.235905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.235926 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.339029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.339078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.339089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.339104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.339116 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.441958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.442046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.442064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.442090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.442108 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.544287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.544398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.544413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.544436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.544453 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.648347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.648436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.648454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.648481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.648499 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.752080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.752139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.752152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.752173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.752186 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.855593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.855659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.855679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.855705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.855724 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.958743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.958793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.958812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.958833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.958850 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:06Z","lastTransitionTime":"2026-02-18T11:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:06 crc kubenswrapper[4922]: I0218 11:38:06.973573 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 22:00:36.122546661 +0000 UTC Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.062341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.062444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.062474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.062505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.062531 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.166065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.166126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.166145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.166169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.166188 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.269339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.269412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.269425 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.269445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.269459 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.373808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.373857 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.373871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.373891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.373911 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.477146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.477193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.477209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.477232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.477248 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.582450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.582493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.582507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.582530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.582543 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.685033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.685069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.685079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.685093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.685105 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.787528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.787627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.787653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.787686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.787709 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.891228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.891276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.891318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.891338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.891352 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.972580 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.972793 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.972964 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.972978 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:07 crc kubenswrapper[4922]: E0218 11:38:07.973149 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:07 crc kubenswrapper[4922]: E0218 11:38:07.973752 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.973812 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 11:18:19.064430432 +0000 UTC Feb 18 11:38:07 crc kubenswrapper[4922]: E0218 11:38:07.973809 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:07 crc kubenswrapper[4922]: E0218 11:38:07.974496 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.975121 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:38:07 crc kubenswrapper[4922]: E0218 11:38:07.975454 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.994476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.994533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.994552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.994577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:07 crc kubenswrapper[4922]: I0218 11:38:07.994600 4922 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:07Z","lastTransitionTime":"2026-02-18T11:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.097739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.097791 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.097803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.097820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.097833 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.200572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.200675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.200712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.200755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.200783 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.304433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.304496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.304512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.304532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.304547 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.407702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.407749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.407760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.407778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.407790 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.511355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.511483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.511509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.511540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.511568 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.615147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.615246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.615271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.615305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.615334 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.718155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.718230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.718249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.718308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.718327 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.821393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.821443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.821457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.821477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.821490 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.924260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.924326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.924346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.924400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.924421 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:08Z","lastTransitionTime":"2026-02-18T11:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.974137 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:19:47.805369703 +0000 UTC Feb 18 11:38:08 crc kubenswrapper[4922]: I0218 11:38:08.991811 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.027726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.027779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.027797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.027817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.027832 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.069792 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4
f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.091698 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.113596 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.128561 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.130650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.130725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.130748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.130780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.130804 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.142049 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc 
kubenswrapper[4922]: I0218 11:38:09.158990 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.175796 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.193659 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.219023 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.234298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.234340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.234424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.234459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.234474 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.237523 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.252088 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.282715 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.321474 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.336106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.336138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.336148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.336163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.336171 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.347859 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.359379 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.370977 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.390840 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:51Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI0218 11:37:51.742382 6997 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.742628 6997 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.742905 6997 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.743039 6997 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.743155 6997 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.744128 6997 factory.go:656] Stopping watch factory\\\\nI0218 11:37:51.758771 6997 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0218 11:37:51.758806 6997 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0218 11:37:51.758875 6997 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:51.758908 6997 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:51.759034 6997 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:09Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.439533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.439575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.439584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.439599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.439609 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.543705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.543784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.543808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.543839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.543864 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.647622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.647671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.647686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.647705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.647720 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.751035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.751108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.751125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.751152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.751172 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.854471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.854583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.854609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.854639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.854660 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.957993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.958041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.958054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.958075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.958089 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:09Z","lastTransitionTime":"2026-02-18T11:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.972384 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:09 crc kubenswrapper[4922]: E0218 11:38:09.972494 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.972693 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.972775 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:09 crc kubenswrapper[4922]: E0218 11:38:09.973297 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:09 crc kubenswrapper[4922]: E0218 11:38:09.973489 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.974017 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:09 crc kubenswrapper[4922]: E0218 11:38:09.974208 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:09 crc kubenswrapper[4922]: I0218 11:38:09.974423 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 10:41:54.036130102 +0000 UTC Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.061410 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.061468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.061481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.061498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.061511 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.164166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.164212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.164223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.164240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.164253 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.268121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.268198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.268220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.268249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.268277 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.372438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.372506 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.372530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.372567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.372585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.476789 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.476879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.476907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.476939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.476957 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.580521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.580591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.580611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.580636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.580657 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.683700 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.683765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.683789 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.683824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.683848 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.787545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.787609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.787631 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.787662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.787699 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.890955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.891009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.891022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.891042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.891055 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.975114 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 16:39:40.974995256 +0000 UTC Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.994128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.994201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.994223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.994251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:10 crc kubenswrapper[4922]: I0218 11:38:10.994269 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:10Z","lastTransitionTime":"2026-02-18T11:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.097720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.097799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.097825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.097859 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.097880 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.201310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.201426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.201451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.201483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.201534 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.304417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.304463 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.304474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.304490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.304502 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.407761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.407841 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.407866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.407897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.407918 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.511519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.511635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.511662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.511702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.511727 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.614749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.614835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.614853 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.614878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.614896 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.718198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.718404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.718435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.718461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.718479 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.821622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.821702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.821724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.821752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.821774 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.924025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.924075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.924088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.924103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.924119 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:11Z","lastTransitionTime":"2026-02-18T11:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.972259 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:11 crc kubenswrapper[4922]: E0218 11:38:11.972466 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.972631 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.972721 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:11 crc kubenswrapper[4922]: E0218 11:38:11.972962 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.972674 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:11 crc kubenswrapper[4922]: E0218 11:38:11.973310 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:11 crc kubenswrapper[4922]: E0218 11:38:11.973192 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.975626 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 02:38:46.750600678 +0000 UTC Feb 18 11:38:11 crc kubenswrapper[4922]: I0218 11:38:11.986304 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.027170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.027229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.027246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.027271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.027289 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.130540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.130606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.130627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.130987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.131075 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.233684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.233726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.233758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.233774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.233786 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.336430 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.336508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.336533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.336565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.336593 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.440871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.440936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.440955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.440979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.441000 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.544414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.544512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.544533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.544588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.544606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.647523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.647583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.647597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.647620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.647634 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.750590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.750660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.750679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.750706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.750726 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.853304 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.853354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.853382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.853397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.853406 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.955546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.955672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.955692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.955717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.955734 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:12Z","lastTransitionTime":"2026-02-18T11:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:12 crc kubenswrapper[4922]: I0218 11:38:12.975808 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:48:05.480725649 +0000 UTC Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.059006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.059039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.059049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.059065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.059076 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.162533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.162606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.162629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.162656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.162677 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.265595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.265670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.265683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.265699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.265711 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.368057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.368106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.368114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.368128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.368139 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.471014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.471085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.471111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.471141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.471162 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.574085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.574120 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.574129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.574142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.574152 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.677231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.677308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.677341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.677418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.677444 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.780484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.780552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.780571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.780600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.780627 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.883667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.883729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.883745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.883769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.883786 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.972260 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.972347 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:13 crc kubenswrapper[4922]: E0218 11:38:13.972419 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.972397 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:13 crc kubenswrapper[4922]: E0218 11:38:13.972670 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.972723 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:13 crc kubenswrapper[4922]: E0218 11:38:13.973224 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:13 crc kubenswrapper[4922]: E0218 11:38:13.973330 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.976590 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 10:59:51.836766503 +0000 UTC Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.986216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.986248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.986256 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.986268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:13 crc kubenswrapper[4922]: I0218 11:38:13.986281 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:13Z","lastTransitionTime":"2026-02-18T11:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.089599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.089633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.089644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.089658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.089669 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.192230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.192274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.192287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.192303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.192315 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.294231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.294270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.294281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.294294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.294303 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.396506 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.396570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.396592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.396618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.396639 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.499951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.500005 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.500015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.500031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.500041 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.602545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.602604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.602621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.602646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.602664 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.705455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.705495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.705504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.705517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.705528 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.807734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.807796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.807814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.807837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.807853 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.911243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.911286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.911297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.911316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.911327 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:14Z","lastTransitionTime":"2026-02-18T11:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:14 crc kubenswrapper[4922]: I0218 11:38:14.977075 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 11:07:56.534093459 +0000 UTC Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.014323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.014410 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.014424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.014442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.014454 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.117505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.117640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.117667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.117697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.117720 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.220238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.220314 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.220338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.220404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.220429 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.323914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.323982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.323998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.324041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.324054 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.428105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.428167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.428182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.428203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.428219 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.532391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.532433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.532444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.532459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.532471 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.635643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.636033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.636048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.636067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.636081 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.739932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.740004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.740018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.740043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.740058 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.843466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.843544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.843563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.843589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.843606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.946092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.946161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.946174 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.946191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.946222 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:15Z","lastTransitionTime":"2026-02-18T11:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.972071 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.972106 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:15 crc kubenswrapper[4922]: E0218 11:38:15.972239 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.972115 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.972106 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:15 crc kubenswrapper[4922]: E0218 11:38:15.972548 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:15 crc kubenswrapper[4922]: E0218 11:38:15.972743 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:15 crc kubenswrapper[4922]: E0218 11:38:15.972455 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:15 crc kubenswrapper[4922]: I0218 11:38:15.978204 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 19:47:02.301635378 +0000 UTC Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.049104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.049169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.049188 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.049213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.049231 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.152308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.152412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.152439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.152467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.152489 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.255697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.255752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.255772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.255795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.255814 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.360209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.360282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.360307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.360409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.360446 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.419668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.419720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.419734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.419753 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.419767 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: E0218 11:38:16.438803 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:16Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.442613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.442645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.442657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.442673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.442685 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: E0218 11:38:16.455839 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:16Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.459221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.459252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.459264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.459281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.459293 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: E0218 11:38:16.471719 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:16Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.475135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.475184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.475198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.475217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.475232 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: E0218 11:38:16.485691 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:16Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.488996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.489033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.489044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.489058 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.489067 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: E0218 11:38:16.503333 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:38:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"142c3684-8991-4ed8-97d2-827c32777413\\\",\\\"systemUUID\\\":\\\"e6651d54-876d-4ba5-b7d2-813fd96498e1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:16Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:16 crc kubenswrapper[4922]: E0218 11:38:16.503504 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.505078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.505130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.505143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.505162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.505174 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.608024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.608113 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.608133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.608167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.608186 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.711542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.711627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.711645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.711673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.711686 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.815110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.815159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.815170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.815188 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.815201 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.918243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.918283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.918291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.918305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.918314 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:16Z","lastTransitionTime":"2026-02-18T11:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:16 crc kubenswrapper[4922]: I0218 11:38:16.978704 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 04:25:23.818566832 +0000 UTC Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.021067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.021254 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.021280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.021301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.021317 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.124051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.124108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.124126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.124149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.124167 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.227471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.227556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.227584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.227616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.227643 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.330345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.330457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.330479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.330508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.330527 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.433788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.433860 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.433878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.433899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.433915 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.537145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.537226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.537251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.537281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.537305 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.641972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.642054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.642072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.642103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.642124 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.745221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.745295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.745315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.745340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.745392 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.750120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:17 crc kubenswrapper[4922]: E0218 11:38:17.750301 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:38:17 crc kubenswrapper[4922]: E0218 11:38:17.750428 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs podName:4702cf45-b47b-4291-a553-5bfc7bc22674 nodeName:}" failed. No retries permitted until 2026-02-18 11:39:21.750401655 +0000 UTC m=+163.478105775 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs") pod "network-metrics-daemon-pspfr" (UID: "4702cf45-b47b-4291-a553-5bfc7bc22674") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.849103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.849170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.849192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.849217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.849235 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.952754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.952839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.952863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.952891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.952914 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:17Z","lastTransitionTime":"2026-02-18T11:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.972598 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:17 crc kubenswrapper[4922]: E0218 11:38:17.972786 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.973224 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:17 crc kubenswrapper[4922]: E0218 11:38:17.973435 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.973751 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.973763 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:17 crc kubenswrapper[4922]: E0218 11:38:17.974144 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:17 crc kubenswrapper[4922]: E0218 11:38:17.974279 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:17 crc kubenswrapper[4922]: I0218 11:38:17.978883 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:57:52.933419806 +0000 UTC Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.055479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.055540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.055563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.055593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.055619 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.158534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.158585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.158594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.158616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.158626 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.261789 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.261846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.261866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.261891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.261911 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.364562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.364618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.364635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.364659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.364677 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.467400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.467473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.467493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.467516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.467534 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.571405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.571505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.571527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.571559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.571585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.674884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.674961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.674985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.675014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.675036 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.778231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.778275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.778285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.778299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.778308 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.881219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.881259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.881270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.881285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.881297 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.979070 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 20:52:59.815705932 +0000 UTC Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.983817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.983892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.983919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.983951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.983985 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:18Z","lastTransitionTime":"2026-02-18T11:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:18 crc kubenswrapper[4922]: I0218 11:38:18.996259 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"384d508e-8f6a-427d-a85d-85bb61d8405e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"g file observer\\\\nW0218 11:36:58.792829 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0218 11:36:58.792990 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 11:36:58.794119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-56419459/tls.crt::/tmp/serving-cert-56419459/tls.key\\\\\\\"\\\\nI0218 11:36:58.994875 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0218 11:36:59.009974 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0218 11:36:59.010010 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0218 11:36:59.010039 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0218 11:36:59.010047 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0218 11:36:59.021309 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0218 11:36:59.021338 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021343 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0218 11:36:59.021347 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0218 11:36:59.021350 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0218 11:36:59.021353 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:36:59.021356 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:36:59.021636 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0218 11:36:59.024101 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.018551 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://454f72313609f075cd3f659b19279f8c3b07e8fea70f0ca7c88c5ca8e2e57397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0577fe2fd5192d4ec36ec8bfc6259a533b6422ef672d8153f601c2ac96bb5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.044289 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26zbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592c6351-c252-4c19-b3b1-167096be2de9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca36b120b6a8f3bf08a810358bdd23b8b3c3105ede8e850232a9fef2d322db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7585194c5a1150a4dc59095db7e04747db55227d8005c7b472fcd61f6b2480e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89daf582468d65a1bb8b328ea34eca65ffc45ab10f932d8e2770b0afedcb8c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e7189806450269de2a8b03e9776336e5b727da219f6ba797b5249205994ce00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66447
f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66447f702724cd3d39c49873ee30b35df3d9a2cc1f1a7dbdcf39fc64673febc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c04f0a35c3197d7383659633eac3058f316dd2b395515e372293162567baa959\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b85fcbd7611f3c38d84b9c892b62a9677e205d9f690e6cd2057dab3276e296\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-55zqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26zbd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.059436 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648f85d5-dbc6-4db6-b590-3edc96740212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e419e9c268c54b7829fa957e360c5b2c9943232871f9773ab32017c36a44b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f493c42d7223ae48d738cf9973c2f6f5997722b017cc1665ec5f27ea0542dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8l2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjnxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.072826 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w46bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"067f44ac-9e60-4581-87cc-f2e1c823fc4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4c52fc2f1b105e90957f8cd7424352990f9ec9e5350d1a501980bfda1613066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-5ks8d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w46bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.087720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.087778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.087832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.087855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.087872 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.088983 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pspfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702cf45-b47b-4291-a553-5bfc7bc22674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7224x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:37:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pspfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc 
kubenswrapper[4922]: I0218 11:38:19.101087 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adc6a195-a6bd-42d8-990f-60b614c51413\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc50b682470a4987df420a64c6ec491b7137229551303ea861e8c4c037cba371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://32c42a81cb689c2599fc99a29686dcfa4beb5434da7149bbe8ca19d545a579bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c42a81cb689c2599fc99a29686dcfa4beb5434da7149bbe8ca19d545a579bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.122150 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47a2352-be16-435d-9207-078c64f99e77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d641439c1ae42b2579e47041fc257fa37643ec1ef4fcacf55b15b82d6ec02cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac316efde113b3547d00f480ee2c658958ba4f99c719d0920f333ab2a62ebaf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a8f8152ee7536b35054611238f4050c7f0e3a756f74ae8525af7f50150f88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.140987 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"143e32d8-7a32-4ab1-9790-a5b4fe1f0180\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70b47d4c3df2edbdf1c33bdcff5642932d5212a23d54b20e79ec909b2e3de12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75e27a0c7d36ad7d1c3f884b9bb69b963a922ad7851d263b3761ed5bc7e79a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a488435a081bdf17035a1ff4df73b0723045f0eef9049ec259198c697ecc33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://39ce7973e25ba416a5f82cbafb8c1b3bea34718c49f8a8a56bd46970dc688727\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.159581 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.176058 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.190732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.190770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.190784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.190800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.190838 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.199808 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.238201 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbe0881c-3173-4207-adb8-72c0dd4fa9c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e73616098bbc0fcdcd0b7e6b5bf610a7c193bdb65922e4532ce9553e9fc6e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71f2a53a03eafd89329b955a7edbe71c610706392e0a07fa09b9a6247889913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cac665b2d5a707e29339e7e8afa75dffcfd1dc1f42262e2643b2487ae68558a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0517235f78003f9b1a9044287c23aebed811299dc48f24c6643dd3bfcf5cf1d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a0c8c34dd5237a5b2d52b436f5c4deabd96a96ccb8073961dc49aaffd62c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:36:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c0c44a6ef5852d738ae2b528cfb201946f92b8c20d202c41b25be58aed7a4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:36:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5499f94aab1c37157b868db9ca72c519a84e0972ed310a60db2ea52a9148ddbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62432fda808af44d25af78beb104a775a1f211f1456d7112897af762cf72914f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:36:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:36:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.255820 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65825dc8bc12fe9933ff1392fd17a19042bf62d65c0b1ba22303094881745e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.272989 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9xzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b4595ac-c521-4ada-950d-e1b01cdff99b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:47Z\\\",\\\"message\\\":\\\"2026-02-18T11:37:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f\\\\n2026-02-18T11:37:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66f40f5e-a7a5-411c-a907-9573d8b5508f to /host/opt/cni/bin/\\\\n2026-02-18T11:37:02Z [verbose] multus-daemon started\\\\n2026-02-18T11:37:02Z [verbose] 
Readiness Indicator file check\\\\n2026-02-18T11:37:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zv6rh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9xzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.289791 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3079844ef727cc58eb6253c7eaa0289ca84977a00334a10cdbf493b16133a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.294080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.294140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.294157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.294178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.294193 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.303068 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdb7cedc-b2e3-48f0-80e0-e17073b43228\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c29cb76dded770e8691cd270c1c1197ffd3764c3fa73faf7f072a63c9e3e4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqgzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-znglx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.328380 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"653a41bb-bb1d-421c-a92b-7f2811d95edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:37:51Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:141\\\\nI0218 11:37:51.742382 6997 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.742628 6997 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.742905 6997 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.743039 6997 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:37:51.743155 6997 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:37:51.744128 6997 factory.go:656] Stopping watch factory\\\\nI0218 11:37:51.758771 6997 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0218 11:37:51.758806 6997 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0218 11:37:51.758875 6997 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:37:51.758908 6997 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0218 11:37:51.759034 6997 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64c67cddfeb39e5545
606a34a3bd8080de4a210b966a185824b2d374b880c794\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:37:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26p2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wg4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.342182 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5qkb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd3723d-a12f-4c7c-a1ea-63bfef3c931a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:36:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:37:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ff8efc7bd7cb5045841f85767cea766898289dab6f4081829d530fbaaf9dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:37:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c45cw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:36:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5qkb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:38:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.397549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.397602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.397620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.397644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.397662 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.500526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.500597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.500637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.500673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.500697 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.603495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.603593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.603621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.603655 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.603681 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.706030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.706081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.706093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.706110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.706122 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.809284 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.809432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.809467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.809494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.809513 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.914877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.914936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.914954 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.914978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.914995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:19Z","lastTransitionTime":"2026-02-18T11:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.973152 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.973269 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.973194 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.973152 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:19 crc kubenswrapper[4922]: E0218 11:38:19.973444 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:19 crc kubenswrapper[4922]: E0218 11:38:19.973694 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:19 crc kubenswrapper[4922]: E0218 11:38:19.973891 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:19 crc kubenswrapper[4922]: E0218 11:38:19.974000 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:19 crc kubenswrapper[4922]: I0218 11:38:19.979574 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 16:29:27.591157044 +0000 UTC Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.018269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.018316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.018333 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.018355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.018396 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.121496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.121565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.121584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.121611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.121630 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.224962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.225060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.225080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.225141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.225158 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.327320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.327442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.327472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.327504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.327526 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.430562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.430612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.430622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.430639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.430648 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.534595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.534680 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.534707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.534739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.534758 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.637656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.637736 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.637756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.637788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.637807 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.740481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.740554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.740571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.740591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.740605 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.843517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.843645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.843657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.843674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.843684 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.947074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.947144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.947159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.947179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.947195 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:20Z","lastTransitionTime":"2026-02-18T11:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:20 crc kubenswrapper[4922]: I0218 11:38:20.979889 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 23:54:06.706686542 +0000 UTC Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.049440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.049478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.049489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.049504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.049515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.152727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.152853 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.152877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.152904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.152921 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.256898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.256979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.256998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.257022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.257041 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.359793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.359826 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.359835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.359847 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.359855 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.463568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.463628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.463644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.463668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.463686 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.567063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.567119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.567133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.567155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.567177 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.670143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.670203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.670216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.670232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.670245 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.773559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.773591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.773607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.773625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.773637 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.876308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.876339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.876348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.876470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.876515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.973153 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.973268 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.973277 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:21 crc kubenswrapper[4922]: E0218 11:38:21.973414 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:21 crc kubenswrapper[4922]: E0218 11:38:21.973471 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.973542 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:21 crc kubenswrapper[4922]: E0218 11:38:21.973633 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:21 crc kubenswrapper[4922]: E0218 11:38:21.973839 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.979798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.979872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.979895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.979918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.979935 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:21Z","lastTransitionTime":"2026-02-18T11:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:21 crc kubenswrapper[4922]: I0218 11:38:21.980044 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 17:32:27.736818719 +0000 UTC Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.082842 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.082893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.082905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.082922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.082936 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.186594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.186650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.186661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.186677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.186689 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.289486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.289541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.289557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.289575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.289586 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.392575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.392651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.392668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.392694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.392765 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.495823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.495887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.495908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.495931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.495948 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.599787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.599830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.599839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.599858 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.599868 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.703000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.703047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.703056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.703071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.703080 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.805720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.805777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.805792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.805813 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.805826 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.908721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.908786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.908811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.908837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.908858 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:22Z","lastTransitionTime":"2026-02-18T11:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.973251 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:38:22 crc kubenswrapper[4922]: E0218 11:38:22.973508 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wg4r5_openshift-ovn-kubernetes(653a41bb-bb1d-421c-a92b-7f2811d95edf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" Feb 18 11:38:22 crc kubenswrapper[4922]: I0218 11:38:22.981204 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:38:57.273600198 +0000 UTC Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.011999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.012048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.012062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.012083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.012096 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.114346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.114401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.114411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.114428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.114441 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.217380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.217714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.217828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.217937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.218026 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.320762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.320843 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.320866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.320895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.320920 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.423514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.423807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.423889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.423958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.424023 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.527689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.527758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.527776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.527799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.527816 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.630472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.630556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.630579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.631018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.631280 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.734638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.734687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.734706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.734727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.734747 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.837786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.837834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.837854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.837876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.837895 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.941182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.941273 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.941292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.941316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.941333 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:23Z","lastTransitionTime":"2026-02-18T11:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.972982 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.973110 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:23 crc kubenswrapper[4922]: E0218 11:38:23.973168 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.973312 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.973346 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:23 crc kubenswrapper[4922]: E0218 11:38:23.973551 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:23 crc kubenswrapper[4922]: E0218 11:38:23.973720 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:23 crc kubenswrapper[4922]: E0218 11:38:23.973869 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:23 crc kubenswrapper[4922]: I0218 11:38:23.981595 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:59:43.393021193 +0000 UTC Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.043539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.043585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.043597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.043610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.043620 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.146886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.146927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.146936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.146952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.146963 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.250532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.250636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.250655 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.250685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.250706 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.353541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.353586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.353597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.353613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.353626 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.457202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.457259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.457271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.457292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.457328 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.561663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.562137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.562214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.562287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.562353 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.665936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.666034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.666064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.666099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.666124 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.769458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.769514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.769527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.769546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.769558 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.873419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.873469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.873480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.873497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.873513 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.976908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.976951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.976963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.976980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.976990 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:24Z","lastTransitionTime":"2026-02-18T11:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:24 crc kubenswrapper[4922]: I0218 11:38:24.982478 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 20:16:49.602918227 +0000 UTC Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.079911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.079987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.080008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.080035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.080053 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.182582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.182642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.182656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.182673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.182685 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.286484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.286880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.287028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.287162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.287332 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.390779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.390865 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.390902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.390933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.390955 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.493411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.493522 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.493548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.493579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.493600 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.596601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.596649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.596662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.596680 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.596690 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.699952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.700009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.700027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.700049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.700067 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.803332 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.803449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.803479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.803511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.803532 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.907160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.907225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.907247 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.907271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.907291 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:25Z","lastTransitionTime":"2026-02-18T11:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.973138 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.973138 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.973846 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:25 crc kubenswrapper[4922]: E0218 11:38:25.974093 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:25 crc kubenswrapper[4922]: E0218 11:38:25.974247 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:25 crc kubenswrapper[4922]: E0218 11:38:25.974615 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.974813 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:25 crc kubenswrapper[4922]: E0218 11:38:25.975111 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:25 crc kubenswrapper[4922]: I0218 11:38:25.983474 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 19:18:39.844377495 +0000 UTC Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.011140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.011717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.011884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.012052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.012198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.115661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.116507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.116674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.116828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.117157 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.220931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.220989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.221007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.221037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.221059 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.324215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.324605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.324872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.325110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.325307 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.429055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.429408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.429520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.429626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.429790 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.532863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.532911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.532923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.532944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.532957 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.635171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.635248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.635271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.635298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.635317 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.738093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.738180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.738211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.738241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.738263 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.799904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.799969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.799984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.800001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.800014 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:38:26Z","lastTransitionTime":"2026-02-18T11:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.859924 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j"] Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.860651 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.863456 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.863692 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.863780 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.863857 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.911154 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w46bt" podStartSLOduration=88.911122828 podStartE2EDuration="1m28.911122828s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:26.897324754 +0000 UTC m=+108.625028874" watchObservedRunningTime="2026-02-18 11:38:26.911122828 +0000 UTC m=+108.638826948" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.943131 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=15.943100091 podStartE2EDuration="15.943100091s" podCreationTimestamp="2026-02-18 11:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:26.926511881 +0000 UTC m=+108.654215991" watchObservedRunningTime="2026-02-18 11:38:26.943100091 +0000 UTC 
m=+108.670804201" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.957081 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-service-ca\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.957341 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.957472 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.957573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.957684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.958857 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=81.958832512 podStartE2EDuration="1m21.958832512s" podCreationTimestamp="2026-02-18 11:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:26.944072164 +0000 UTC m=+108.671776284" watchObservedRunningTime="2026-02-18 11:38:26.958832512 +0000 UTC m=+108.686536622" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.959288 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.959275012 podStartE2EDuration="1m0.959275012s" podCreationTimestamp="2026-02-18 11:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:26.958093195 +0000 UTC m=+108.685797305" watchObservedRunningTime="2026-02-18 11:38:26.959275012 +0000 UTC m=+108.686979152" Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.984214 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 09:19:54.691378788 +0000 UTC Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.984305 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 18 11:38:26 crc kubenswrapper[4922]: I0218 11:38:26.995243 4922 reflector.go:368] Caches populated for *v1.CertificateSigningRequest 
from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.039275 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.039256517 podStartE2EDuration="1m28.039256517s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.038443787 +0000 UTC m=+108.766147897" watchObservedRunningTime="2026-02-18 11:38:27.039256517 +0000 UTC m=+108.766960617" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058544 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-service-ca\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058671 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058690 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.058787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.059866 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-service-ca\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.071155 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.080841 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2f5e320-c4e9-42c9-8af5-f528cc87ffba-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-n7b8j\" (UID: \"a2f5e320-c4e9-42c9-8af5-f528cc87ffba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.086156 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c9xzd" podStartSLOduration=89.086143491 podStartE2EDuration="1m29.086143491s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.085392473 +0000 UTC m=+108.813096563" watchObservedRunningTime="2026-02-18 11:38:27.086143491 +0000 UTC m=+108.813847571" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.117895 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podStartSLOduration=89.117872709 podStartE2EDuration="1m29.117872709s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.117637393 +0000 UTC m=+108.845341473" 
watchObservedRunningTime="2026-02-18 11:38:27.117872709 +0000 UTC m=+108.845576809" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.131137 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q5qkb" podStartSLOduration=89.131117761 podStartE2EDuration="1m29.131117761s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.130940496 +0000 UTC m=+108.858644596" watchObservedRunningTime="2026-02-18 11:38:27.131117761 +0000 UTC m=+108.858821841" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.149438 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.149418082 podStartE2EDuration="1m28.149418082s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.147788533 +0000 UTC m=+108.875492623" watchObservedRunningTime="2026-02-18 11:38:27.149418082 +0000 UTC m=+108.877122182" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.179639 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.197172 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-26zbd" podStartSLOduration=89.197150136 podStartE2EDuration="1m29.197150136s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.183917584 +0000 UTC m=+108.911621674" watchObservedRunningTime="2026-02-18 11:38:27.197150136 +0000 UTC m=+108.924854226" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.221146 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjnxs" podStartSLOduration=88.221128621 podStartE2EDuration="1m28.221128621s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.200213988 +0000 UTC m=+108.927918078" watchObservedRunningTime="2026-02-18 11:38:27.221128621 +0000 UTC m=+108.948832701" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.627899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" event={"ID":"a2f5e320-c4e9-42c9-8af5-f528cc87ffba","Type":"ContainerStarted","Data":"3a203f7facc64ff381081a49c17212041c7856393c8abc131e8c003f76c085fd"} Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.627948 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" event={"ID":"a2f5e320-c4e9-42c9-8af5-f528cc87ffba","Type":"ContainerStarted","Data":"4d0319557160db52cd2b9a747597dc410c4d850ddac4b4c0e255b8afb6e4e603"} Feb 18 11:38:27 crc 
kubenswrapper[4922]: I0218 11:38:27.973014 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.973128 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.973141 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:27 crc kubenswrapper[4922]: I0218 11:38:27.973014 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:27 crc kubenswrapper[4922]: E0218 11:38:27.973279 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:27 crc kubenswrapper[4922]: E0218 11:38:27.973436 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:27 crc kubenswrapper[4922]: E0218 11:38:27.973594 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:27 crc kubenswrapper[4922]: E0218 11:38:27.973835 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:29 crc kubenswrapper[4922]: I0218 11:38:29.972795 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:29 crc kubenswrapper[4922]: I0218 11:38:29.972835 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:29 crc kubenswrapper[4922]: I0218 11:38:29.972861 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:29 crc kubenswrapper[4922]: E0218 11:38:29.972999 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:29 crc kubenswrapper[4922]: E0218 11:38:29.973224 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:29 crc kubenswrapper[4922]: E0218 11:38:29.973323 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:29 crc kubenswrapper[4922]: I0218 11:38:29.973839 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:29 crc kubenswrapper[4922]: E0218 11:38:29.974041 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:31 crc kubenswrapper[4922]: I0218 11:38:31.972100 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:31 crc kubenswrapper[4922]: I0218 11:38:31.972146 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:31 crc kubenswrapper[4922]: E0218 11:38:31.972274 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:31 crc kubenswrapper[4922]: I0218 11:38:31.972544 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:31 crc kubenswrapper[4922]: I0218 11:38:31.972601 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:31 crc kubenswrapper[4922]: E0218 11:38:31.972717 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:31 crc kubenswrapper[4922]: E0218 11:38:31.973066 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:31 crc kubenswrapper[4922]: E0218 11:38:31.973277 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:33 crc kubenswrapper[4922]: I0218 11:38:33.972537 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:33 crc kubenswrapper[4922]: I0218 11:38:33.972676 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:33 crc kubenswrapper[4922]: I0218 11:38:33.972729 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:33 crc kubenswrapper[4922]: I0218 11:38:33.972753 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:33 crc kubenswrapper[4922]: E0218 11:38:33.972689 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:33 crc kubenswrapper[4922]: E0218 11:38:33.972914 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:33 crc kubenswrapper[4922]: E0218 11:38:33.973023 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:33 crc kubenswrapper[4922]: E0218 11:38:33.973184 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.653661 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/1.log" Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.654535 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/0.log" Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.654607 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b4595ac-c521-4ada-950d-e1b01cdff99b" containerID="71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765" exitCode=1 Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.654654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerDied","Data":"71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765"} Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.654693 4922 scope.go:117] "RemoveContainer" containerID="83f784f6e4f98fea5928c944062388eaa69d5a5782490bdb120cd735f54f52df" Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.655210 4922 scope.go:117] "RemoveContainer" containerID="71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765" Feb 18 11:38:34 crc kubenswrapper[4922]: E0218 11:38:34.655412 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-c9xzd_openshift-multus(9b4595ac-c521-4ada-950d-e1b01cdff99b)\"" pod="openshift-multus/multus-c9xzd" podUID="9b4595ac-c521-4ada-950d-e1b01cdff99b" Feb 18 11:38:34 crc kubenswrapper[4922]: I0218 11:38:34.688865 4922 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-n7b8j" podStartSLOduration=96.688850375 podStartE2EDuration="1m36.688850375s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:27.647016753 +0000 UTC m=+109.374720833" watchObservedRunningTime="2026-02-18 11:38:34.688850375 +0000 UTC m=+116.416554455" Feb 18 11:38:35 crc kubenswrapper[4922]: I0218 11:38:35.660544 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/1.log" Feb 18 11:38:35 crc kubenswrapper[4922]: I0218 11:38:35.973287 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:35 crc kubenswrapper[4922]: I0218 11:38:35.973316 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:35 crc kubenswrapper[4922]: E0218 11:38:35.974181 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:35 crc kubenswrapper[4922]: I0218 11:38:35.973509 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:35 crc kubenswrapper[4922]: I0218 11:38:35.972031 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:35 crc kubenswrapper[4922]: E0218 11:38:35.974870 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:35 crc kubenswrapper[4922]: E0218 11:38:35.974892 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:35 crc kubenswrapper[4922]: E0218 11:38:35.974975 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:37 crc kubenswrapper[4922]: I0218 11:38:37.972908 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:37 crc kubenswrapper[4922]: I0218 11:38:37.972922 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:37 crc kubenswrapper[4922]: I0218 11:38:37.972922 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:37 crc kubenswrapper[4922]: I0218 11:38:37.972970 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:37 crc kubenswrapper[4922]: E0218 11:38:37.973341 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:37 crc kubenswrapper[4922]: E0218 11:38:37.973578 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:37 crc kubenswrapper[4922]: E0218 11:38:37.973777 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:37 crc kubenswrapper[4922]: E0218 11:38:37.974295 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:37 crc kubenswrapper[4922]: I0218 11:38:37.975289 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:38:38 crc kubenswrapper[4922]: I0218 11:38:38.673578 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/3.log" Feb 18 11:38:38 crc kubenswrapper[4922]: I0218 11:38:38.676275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerStarted","Data":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} Feb 18 11:38:38 crc kubenswrapper[4922]: I0218 11:38:38.676728 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:38:38 crc kubenswrapper[4922]: I0218 11:38:38.706506 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podStartSLOduration=100.706487426 podStartE2EDuration="1m40.706487426s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:38:38.705650706 +0000 UTC m=+120.433354826" 
watchObservedRunningTime="2026-02-18 11:38:38.706487426 +0000 UTC m=+120.434191526" Feb 18 11:38:38 crc kubenswrapper[4922]: I0218 11:38:38.905091 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pspfr"] Feb 18 11:38:38 crc kubenswrapper[4922]: I0218 11:38:38.905245 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:38 crc kubenswrapper[4922]: E0218 11:38:38.905403 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:38 crc kubenswrapper[4922]: E0218 11:38:38.957325 4922 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 18 11:38:39 crc kubenswrapper[4922]: E0218 11:38:39.066551 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:38:39 crc kubenswrapper[4922]: I0218 11:38:39.973208 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:39 crc kubenswrapper[4922]: I0218 11:38:39.973830 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:39 crc kubenswrapper[4922]: E0218 11:38:39.973906 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:39 crc kubenswrapper[4922]: I0218 11:38:39.973932 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:39 crc kubenswrapper[4922]: E0218 11:38:39.974053 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:39 crc kubenswrapper[4922]: E0218 11:38:39.974166 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:40 crc kubenswrapper[4922]: I0218 11:38:40.972023 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:40 crc kubenswrapper[4922]: E0218 11:38:40.972293 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:41 crc kubenswrapper[4922]: I0218 11:38:41.972215 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:41 crc kubenswrapper[4922]: I0218 11:38:41.972318 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:41 crc kubenswrapper[4922]: E0218 11:38:41.972483 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:41 crc kubenswrapper[4922]: E0218 11:38:41.972647 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:41 crc kubenswrapper[4922]: I0218 11:38:41.973809 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:41 crc kubenswrapper[4922]: E0218 11:38:41.974139 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:42 crc kubenswrapper[4922]: I0218 11:38:42.972599 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:42 crc kubenswrapper[4922]: E0218 11:38:42.972795 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:43 crc kubenswrapper[4922]: I0218 11:38:43.972019 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:43 crc kubenswrapper[4922]: I0218 11:38:43.972156 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:43 crc kubenswrapper[4922]: E0218 11:38:43.972242 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:43 crc kubenswrapper[4922]: I0218 11:38:43.972278 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:43 crc kubenswrapper[4922]: E0218 11:38:43.972501 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:43 crc kubenswrapper[4922]: E0218 11:38:43.972643 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:44 crc kubenswrapper[4922]: E0218 11:38:44.068208 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:38:44 crc kubenswrapper[4922]: I0218 11:38:44.972782 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:44 crc kubenswrapper[4922]: E0218 11:38:44.973065 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:45 crc kubenswrapper[4922]: I0218 11:38:45.972733 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:45 crc kubenswrapper[4922]: I0218 11:38:45.972836 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:45 crc kubenswrapper[4922]: E0218 11:38:45.972957 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:45 crc kubenswrapper[4922]: I0218 11:38:45.972983 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:45 crc kubenswrapper[4922]: E0218 11:38:45.973113 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:45 crc kubenswrapper[4922]: E0218 11:38:45.973193 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:46 crc kubenswrapper[4922]: I0218 11:38:46.972769 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:46 crc kubenswrapper[4922]: E0218 11:38:46.972937 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:47 crc kubenswrapper[4922]: I0218 11:38:47.972441 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:47 crc kubenswrapper[4922]: I0218 11:38:47.972491 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:47 crc kubenswrapper[4922]: I0218 11:38:47.972531 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:47 crc kubenswrapper[4922]: E0218 11:38:47.972679 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:47 crc kubenswrapper[4922]: E0218 11:38:47.972843 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:47 crc kubenswrapper[4922]: E0218 11:38:47.973324 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:48 crc kubenswrapper[4922]: I0218 11:38:48.973741 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:48 crc kubenswrapper[4922]: E0218 11:38:48.975789 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:49 crc kubenswrapper[4922]: E0218 11:38:49.069430 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:38:49 crc kubenswrapper[4922]: I0218 11:38:49.972933 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:49 crc kubenswrapper[4922]: I0218 11:38:49.972978 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:49 crc kubenswrapper[4922]: I0218 11:38:49.973024 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:49 crc kubenswrapper[4922]: E0218 11:38:49.974060 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:49 crc kubenswrapper[4922]: E0218 11:38:49.974148 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:49 crc kubenswrapper[4922]: I0218 11:38:49.974193 4922 scope.go:117] "RemoveContainer" containerID="71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765" Feb 18 11:38:49 crc kubenswrapper[4922]: E0218 11:38:49.974227 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:50 crc kubenswrapper[4922]: I0218 11:38:50.722560 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/1.log" Feb 18 11:38:50 crc kubenswrapper[4922]: I0218 11:38:50.722659 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerStarted","Data":"7270cd2a97c0ebd334ca9515a9ea50d1714e52da4915c1ade982166b94d940c9"} Feb 18 11:38:50 crc kubenswrapper[4922]: I0218 11:38:50.972718 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:50 crc kubenswrapper[4922]: E0218 11:38:50.972888 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:51 crc kubenswrapper[4922]: I0218 11:38:51.972799 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:51 crc kubenswrapper[4922]: I0218 11:38:51.972848 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:51 crc kubenswrapper[4922]: I0218 11:38:51.972799 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:51 crc kubenswrapper[4922]: E0218 11:38:51.973027 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:51 crc kubenswrapper[4922]: E0218 11:38:51.973157 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:51 crc kubenswrapper[4922]: E0218 11:38:51.973334 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:52 crc kubenswrapper[4922]: I0218 11:38:52.972754 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:52 crc kubenswrapper[4922]: E0218 11:38:52.973046 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pspfr" podUID="4702cf45-b47b-4291-a553-5bfc7bc22674" Feb 18 11:38:53 crc kubenswrapper[4922]: I0218 11:38:53.972864 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:38:53 crc kubenswrapper[4922]: I0218 11:38:53.972912 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:53 crc kubenswrapper[4922]: I0218 11:38:53.972873 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:53 crc kubenswrapper[4922]: E0218 11:38:53.973088 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:38:53 crc kubenswrapper[4922]: E0218 11:38:53.973208 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:38:53 crc kubenswrapper[4922]: E0218 11:38:53.973507 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:38:54 crc kubenswrapper[4922]: I0218 11:38:54.972147 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr" Feb 18 11:38:54 crc kubenswrapper[4922]: I0218 11:38:54.975353 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 11:38:54 crc kubenswrapper[4922]: I0218 11:38:54.975664 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.320792 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.972278 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.972422 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.972452 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.974753 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.975814 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.976414 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 18 11:38:55 crc kubenswrapper[4922]: I0218 11:38:55.976728 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.322493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.378292 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ks48g"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.379659 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.380636 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.382654 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.382830 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5sz92"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.383902 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.385033 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.385737 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.389447 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-78x9f"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.390156 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.390463 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.390850 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.392230 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pj7qx"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.393096 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.393238 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.393447 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.396460 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.399419 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.400419 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410142 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-etcd-serving-ca\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410198 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-encryption-config\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-serving-cert\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410272 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-etcd-client\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410315 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a5c3121-2765-47df-aa3f-22595e4b4ea9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410340 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5c3121-2765-47df-aa3f-22595e4b4ea9-config\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410382 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-audit\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410427 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8719fb44-5fea-4fd5-a516-5d2ab11c221c-node-pullsecrets\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410453 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-config\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4a5c3121-2765-47df-aa3f-22595e4b4ea9-images\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl6wl\" (UniqueName: \"kubernetes.io/projected/bb42c973-5e2c-4650-b259-e882429363c7-kube-api-access-wl6wl\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410549 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb42c973-5e2c-4650-b259-e882429363c7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-image-import-ca\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93f5445e-7408-4d36-aa4c-a7461f94d75a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jkmhc\" (UID: \"93f5445e-7408-4d36-aa4c-a7461f94d75a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8719fb44-5fea-4fd5-a516-5d2ab11c221c-audit-dir\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410830 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42c973-5e2c-4650-b259-e882429363c7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xntp\" (UniqueName: \"kubernetes.io/projected/8719fb44-5fea-4fd5-a516-5d2ab11c221c-kube-api-access-4xntp\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410897 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7w4c\" (UniqueName: \"kubernetes.io/projected/93f5445e-7408-4d36-aa4c-a7461f94d75a-kube-api-access-c7w4c\") pod \"cluster-samples-operator-665b6dd947-jkmhc\" (UID: \"93f5445e-7408-4d36-aa4c-a7461f94d75a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.410935 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csh8z\" (UniqueName: \"kubernetes.io/projected/4a5c3121-2765-47df-aa3f-22595e4b4ea9-kube-api-access-csh8z\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.415825 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.416874 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.417299 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.417765 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.418284 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.418525 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.422880 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.423026 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.423202 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.423253 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.423686 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.426464 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sddqb"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.427757 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sddqb"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.433392 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.433717 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.433902 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.434137 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.434351 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.434625 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.434816 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.436067 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.441286 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q7mwg"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.449781 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.451743 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.452121 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.452278 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.454269 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.454492 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.454775 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.455065 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.455086 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.455218 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.455334 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.455644 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.455839 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.456074 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.457801 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-b6dxx"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.458565 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b6dxx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.460919 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.461274 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.461633 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.461880 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.461929 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.462133 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.462976 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nfn89"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.463605 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.467708 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.469504 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.469954 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.471207 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.472081 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.474530 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7twg2"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.475615 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.476672 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.477939 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.490453 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491201 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491318 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491417 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491503 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491586 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491673 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491800 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491853 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.491938 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.492093 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.492182 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.492328 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.492868 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.493920 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.494343 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.494476 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.494561 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.494637 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.492101 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.494936 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495078 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495152 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.494945 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495203 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495158 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495334 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495514 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495728 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.495968 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.496099 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.496607 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.497045 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-prk5g"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.498820 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.501308 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.501575 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.525426 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.531979 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.532076 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536261 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536300 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8719fb44-5fea-4fd5-a516-5d2ab11c221c-audit-dir\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536346 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536390 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/afb43c7e-87bc-4450-ad81-6a22161fb794-machine-approver-tls\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536431 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42c973-5e2c-4650-b259-e882429363c7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536457 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xntp\" (UniqueName: \"kubernetes.io/projected/8719fb44-5fea-4fd5-a516-5d2ab11c221c-kube-api-access-4xntp\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536484 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhx5p\" (UniqueName: \"kubernetes.io/projected/48dabf7e-d1d7-48b6-bc70-5cc88cdcf994-kube-api-access-jhx5p\") pod \"downloads-7954f5f757-b6dxx\" (UID: \"48dabf7e-d1d7-48b6-bc70-5cc88cdcf994\") " pod="openshift-console/downloads-7954f5f757-b6dxx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536513 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-client-ca\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536537 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lkpw\" (UniqueName: \"kubernetes.io/projected/c3bb2fc9-822c-4f53-98bf-70933744cf7f-kube-api-access-6lkpw\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536558 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-serving-cert\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536618 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7w4c\" (UniqueName: \"kubernetes.io/projected/93f5445e-7408-4d36-aa4c-a7461f94d75a-kube-api-access-c7w4c\") pod \"cluster-samples-operator-665b6dd947-jkmhc\" (UID: \"93f5445e-7408-4d36-aa4c-a7461f94d75a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536640 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpknq\" (UniqueName: \"kubernetes.io/projected/9e36551d-13cd-4a75-a29b-658850b46cb8-kube-api-access-mpknq\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536679 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-oauth-serving-cert\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536697 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csh8z\" (UniqueName: \"kubernetes.io/projected/4a5c3121-2765-47df-aa3f-22595e4b4ea9-kube-api-access-csh8z\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536725 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536754 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-service-ca\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536780 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536801 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-etcd-serving-ca\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536823 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-encryption-config\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536845 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6jfkj\" (UniqueName: \"kubernetes.io/projected/9951c815-3e1f-40ad-8597-b558366ffc58-kube-api-access-6jfkj\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536867 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-etcd-client\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536884 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-serving-cert\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536940 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a5c3121-2765-47df-aa3f-22595e4b4ea9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" Feb 18 11:38:57 crc 
kubenswrapper[4922]: I0218 11:38:57.536959 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f4ea70ef-743e-44ef-804c-2f1321999baa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.536979 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6lpr\" (UniqueName: \"kubernetes.io/projected/f4ea70ef-743e-44ef-804c-2f1321999baa-kube-api-access-t6lpr\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537002 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9951c815-3e1f-40ad-8597-b558366ffc58-serving-cert\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537219 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-config\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537239 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-trusted-ca-bundle\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5c3121-2765-47df-aa3f-22595e4b4ea9-config\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537303 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-client-ca\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537327 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 
11:38:57.537349 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbvn\" (UniqueName: \"kubernetes.io/projected/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-kube-api-access-sfbvn\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537391 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhj97\" (UniqueName: \"kubernetes.io/projected/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-kube-api-access-dhj97\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537415 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-config\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537434 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537454 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k46td\" 
(UniqueName: \"kubernetes.io/projected/158f0672-c017-4e45-a564-96de81f21772-kube-api-access-k46td\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537475 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-audit\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537497 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-config\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537516 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537539 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537547 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb43c7e-87bc-4450-ad81-6a22161fb794-config\") pod 
\"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537709 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-config\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bb2fc9-822c-4f53-98bf-70933744cf7f-serving-cert\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537765 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-dir\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8719fb44-5fea-4fd5-a516-5d2ab11c221c-node-pullsecrets\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537830 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/4a5c3121-2765-47df-aa3f-22595e4b4ea9-images\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537850 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-config\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6m8\" (UniqueName: \"kubernetes.io/projected/afb43c7e-87bc-4450-ad81-6a22161fb794-kube-api-access-cz6m8\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537894 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6wl\" (UniqueName: \"kubernetes.io/projected/bb42c973-5e2c-4650-b259-e882429363c7-kube-api-access-wl6wl\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.537895 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8719fb44-5fea-4fd5-a516-5d2ab11c221c-audit-dir\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 
11:38:57.537915 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slf52\" (UniqueName: \"kubernetes.io/projected/14e81dbf-6c73-481c-b758-4c15cc0f3258-kube-api-access-slf52\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.538015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8719fb44-5fea-4fd5-a516-5d2ab11c221c-node-pullsecrets\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.538089 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt2rf"] Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.538728 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42c973-5e2c-4650-b259-e882429363c7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.538783 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.538845 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.539003 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-config\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.539044 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-oauth-config\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.542840 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-config\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.561744 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.562113 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.562995 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.563510 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ea70ef-743e-44ef-804c-2f1321999baa-serving-cert\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.564061 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.564143 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4a5c3121-2765-47df-aa3f-22595e4b4ea9-images\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.568177 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wdprh"] Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.569122 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5c3121-2765-47df-aa3f-22595e4b4ea9-config\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.569681 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-rwxzh"] Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.569977 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb42c973-5e2c-4650-b259-e882429363c7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.570521 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.570953 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.571171 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.571569 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.572245 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.572480 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.572665 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.572966 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 18 11:38:57 crc 
kubenswrapper[4922]: I0218 11:38:57.573095 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.573259 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.573280 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-audit\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.573676 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.573819 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.573951 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.574091 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.574235 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.574348 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.574483 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 11:38:57 
crc kubenswrapper[4922]: I0218 11:38:57.574649 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-etcd-serving-ca\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.574894 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.574941 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.575447 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.575587 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck"] Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.576824 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ks48g"] Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.576912 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.577659 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.577768 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5sz92"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.578580 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a5c3121-2765-47df-aa3f-22595e4b4ea9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.578806 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.580128 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ktkz9"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.580501 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ktkz9"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.580560 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.580868 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.581222 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.581395 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-etcd-client\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.581824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-encryption-config\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.583290 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.584187 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.584754 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.585461 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.585630 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8719fb44-5fea-4fd5-a516-5d2ab11c221c-serving-cert\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.585865 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.585942 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.586345 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.586436 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.586454 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.587131 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.587161 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.587941 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.590452 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.590551 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.592392 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-serving-cert\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.592469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/afb43c7e-87bc-4450-ad81-6a22161fb794-auth-proxy-config\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.592557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.592685 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.592841 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.592913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-service-ca-bundle\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593053 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593629 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-trusted-ca\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593808 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-policies\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593837 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593863 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593897 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-image-import-ca\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593928 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158f0672-c017-4e45-a564-96de81f21772-serving-cert\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.593953 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93f5445e-7408-4d36-aa4c-a7461f94d75a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jkmhc\" (UID: \"93f5445e-7408-4d36-aa4c-a7461f94d75a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.594174 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.594572 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.594647 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.596689 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8719fb44-5fea-4fd5-a516-5d2ab11c221c-image-import-ca\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.598898 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.600591 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.600794 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb42c973-5e2c-4650-b259-e882429363c7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.606265 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.608448 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93f5445e-7408-4d36-aa4c-a7461f94d75a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jkmhc\" (UID: \"93f5445e-7408-4d36-aa4c-a7461f94d75a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.609236 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.609515 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.611487 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nc7b9"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.612433 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.613504 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.614913 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.616769 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.617648 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.619255 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.621436 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9hdml"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.625553 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.629201 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ppzj4"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.629581 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9hdml"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.629875 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.629669 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-78x9f"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.630513 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nfn89"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.634510 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pj7qx"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.635454 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.636191 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.636532 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.638140 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q7mwg"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.639774 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.641101 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.641585 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.642288 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.643416 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt2rf"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.645234 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.646484 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.647812 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-prk5g"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.649214 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.650617 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-l8pr7"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.651283 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b6dxx"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.651409 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l8pr7"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.654048 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.655740 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.656819 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x69t8"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.658816 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x69t8"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.660203 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rwxzh"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.661637 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sddqb"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.662878 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.664063 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wdprh"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.665572 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7twg2"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.667298 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.668889 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.669922 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.671239 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.672640 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.674000 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.674597 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csh8z\" (UniqueName: \"kubernetes.io/projected/4a5c3121-2765-47df-aa3f-22595e4b4ea9-kube-api-access-csh8z\") pod \"machine-api-operator-5694c8668f-5sz92\" (UID: \"4a5c3121-2765-47df-aa3f-22595e4b4ea9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.675727 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ppzj4"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.676842 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.678050 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x69t8"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.679328 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.682067 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.684804 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9hdml"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.686256 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.687389 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nc7b9"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.688610 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gbpm4"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.689446 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gbpm4"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.689928 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gbpm4"]
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.694473 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.694503 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/afb43c7e-87bc-4450-ad81-6a22161fb794-machine-approver-tls\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.694529 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.694991 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-serving-cert\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695034 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fd90c9-8767-4d22-b88e-33fafd8026d8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695053 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-oauth-serving-cert\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695102 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3dc29a-edba-48bc-823b-33b792856873-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695129 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-service-ca\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695221 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ntm\" (UniqueName: \"kubernetes.io/projected/2f7958cf-7c2d-4c29-bea8-5871267d5e16-kube-api-access-n7ntm\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695254 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfkj\" (UniqueName: \"kubernetes.io/projected/9951c815-3e1f-40ad-8597-b558366ffc58-kube-api-access-6jfkj\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695289 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faaf8fb4-0dba-494d-8a14-2dba7901f50a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695313 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2787200d-e2f9-477b-bb3c-c1c40201f13a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695339 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f4ea70ef-743e-44ef-804c-2f1321999baa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695382 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6lpr\" (UniqueName: \"kubernetes.io/projected/f4ea70ef-743e-44ef-804c-2f1321999baa-kube-api-access-t6lpr\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695411 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-audit-policies\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695436 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-config\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695461 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-trusted-ca-bundle\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695515 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b494454-5efb-466f-81bd-754f7d6fa0a8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695544 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ed74e60-8c19-47d2-b760-a6f8678f38da-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695600 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695673 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k46td\" (UniqueName: \"kubernetes.io/projected/158f0672-c017-4e45-a564-96de81f21772-kube-api-access-k46td\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695700 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faaf8fb4-0dba-494d-8a14-2dba7901f50a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695725 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbvn\" (UniqueName: \"kubernetes.io/projected/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-kube-api-access-sfbvn\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695748 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhj97\" (UniqueName: \"kubernetes.io/projected/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-kube-api-access-dhj97\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-config\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695797 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb43c7e-87bc-4450-ad81-6a22161fb794-config\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695822 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3dc29a-edba-48bc-823b-33b792856873-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695869 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-dir\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695883 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-oauth-serving-cert\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695883 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw4z2\" (UniqueName: \"kubernetes.io/projected/69fd90c9-8767-4d22-b88e-33fafd8026d8-kube-api-access-sw4z2\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") "
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695943 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6m8\" (UniqueName: \"kubernetes.io/projected/afb43c7e-87bc-4450-ad81-6a22161fb794-kube-api-access-cz6m8\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.695974 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4891f319-eff4-4b7f-912e-45da55cb4fc2-srv-cert\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696085 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f4ea70ef-743e-44ef-804c-2f1321999baa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696106 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4z5l\" (UniqueName: \"kubernetes.io/projected/353bd1c5-bab8-42cc-925a-d9776ac60b6b-kube-api-access-g4z5l\") pod \"migrator-59844c95c7-9kz6f\" (UID: \"353bd1c5-bab8-42cc-925a-d9776ac60b6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-slf52\" (UniqueName: \"kubernetes.io/projected/14e81dbf-6c73-481c-b758-4c15cc0f3258-kube-api-access-slf52\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696183 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed74e60-8c19-47d2-b760-a6f8678f38da-config\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696212 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-config\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-oauth-config\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696290 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9m2\" (UniqueName: \"kubernetes.io/projected/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-kube-api-access-rb9m2\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 
crc kubenswrapper[4922]: I0218 11:38:57.696317 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/afb43c7e-87bc-4450-ad81-6a22161fb794-auth-proxy-config\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2787200d-e2f9-477b-bb3c-c1c40201f13a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696350 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-dir\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696391 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696423 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696451 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-config\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696477 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-service-ca\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696490 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158f0672-c017-4e45-a564-96de81f21772-serving-cert\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696527 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-metrics-certs\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696562 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhgh\" (UniqueName: 
\"kubernetes.io/projected/9b494454-5efb-466f-81bd-754f7d6fa0a8-kube-api-access-tzhgh\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696590 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/083b5af3-1602-4add-a778-86b19df106c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rwxzh\" (UID: \"083b5af3-1602-4add-a778-86b19df106c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696619 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-ca\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696654 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-service-ca\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.696710 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhx5p\" (UniqueName: \"kubernetes.io/projected/48dabf7e-d1d7-48b6-bc70-5cc88cdcf994-kube-api-access-jhx5p\") pod \"downloads-7954f5f757-b6dxx\" (UID: \"48dabf7e-d1d7-48b6-bc70-5cc88cdcf994\") " pod="openshift-console/downloads-7954f5f757-b6dxx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697421 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-config\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697453 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-client-ca\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lkpw\" (UniqueName: \"kubernetes.io/projected/c3bb2fc9-822c-4f53-98bf-70933744cf7f-kube-api-access-6lkpw\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697509 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/afb43c7e-87bc-4450-ad81-6a22161fb794-auth-proxy-config\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xntp\" (UniqueName: \"kubernetes.io/projected/8719fb44-5fea-4fd5-a516-5d2ab11c221c-kube-api-access-4xntp\") pod \"apiserver-76f77b778f-ks48g\" (UID: \"8719fb44-5fea-4fd5-a516-5d2ab11c221c\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/afb43c7e-87bc-4450-ad81-6a22161fb794-machine-approver-tls\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.697998 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698186 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-config\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698198 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698346 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9f7b66c5-b258-4314-b3a5-e08b958245b6-service-ca-bundle\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698465 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698552 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-default-certificate\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-serving-cert\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.698744 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbk4\" (UniqueName: \"kubernetes.io/projected/027da92d-9293-48ea-bd00-47b0fcb186fd-kube-api-access-hlbk4\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 
crc kubenswrapper[4922]: I0218 11:38:57.699278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpknq\" (UniqueName: \"kubernetes.io/projected/9e36551d-13cd-4a75-a29b-658850b46cb8-kube-api-access-mpknq\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb43c7e-87bc-4450-ad81-6a22161fb794-config\") pod \"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699434 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2787200d-e2f9-477b-bb3c-c1c40201f13a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699576 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699607 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699548 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-service-ca-bundle\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699756 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-config\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 
crc kubenswrapper[4922]: I0218 11:38:57.699847 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-client-ca\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699876 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.699770 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqpz\" (UniqueName: \"kubernetes.io/projected/374ac04a-b37d-42c8-b0ca-e2647c86bc74-kube-api-access-jjqpz\") pod \"dns-operator-744455d44c-wdprh\" (UID: \"374ac04a-b37d-42c8-b0ca-e2647c86bc74\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700012 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-trusted-ca-bundle\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700029 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69fd90c9-8767-4d22-b88e-33fafd8026d8-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700161 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faaf8fb4-0dba-494d-8a14-2dba7901f50a-config\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700254 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700328 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9951c815-3e1f-40ad-8597-b558366ffc58-serving-cert\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/374ac04a-b37d-42c8-b0ca-e2647c86bc74-metrics-tls\") pod \"dns-operator-744455d44c-wdprh\" (UID: \"374ac04a-b37d-42c8-b0ca-e2647c86bc74\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700485 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9951c815-3e1f-40ad-8597-b558366ffc58-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.704725 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.705398 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.705980 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9ww\" (UniqueName: \"kubernetes.io/projected/4891f319-eff4-4b7f-912e-45da55cb4fc2-kube-api-access-ms9ww\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.706054 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4891f319-eff4-4b7f-912e-45da55cb4fc2-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.706103 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-client-ca\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.705983 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.706612 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dh67\" (UniqueName: \"kubernetes.io/projected/9f7b66c5-b258-4314-b3a5-e08b958245b6-kube-api-access-4dh67\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.706671 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/027da92d-9293-48ea-bd00-47b0fcb186fd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.707586 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-client-ca\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.707698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-config\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.707801 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.707851 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027da92d-9293-48ea-bd00-47b0fcb186fd-proxy-tls\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.707964 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ed74e60-8c19-47d2-b760-a6f8678f38da-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: 
\"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708024 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6q9q\" (UniqueName: \"kubernetes.io/projected/083b5af3-1602-4add-a778-86b19df106c2-kube-api-access-l6q9q\") pod \"multus-admission-controller-857f4d67dd-rwxzh\" (UID: \"083b5af3-1602-4add-a778-86b19df106c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708059 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-oauth-config\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708078 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bb2fc9-822c-4f53-98bf-70933744cf7f-serving-cert\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708128 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-config\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708169 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-stats-auth\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.700549 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708401 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b494454-5efb-466f-81bd-754f7d6fa0a8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.708469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9bk\" (UniqueName: \"kubernetes.io/projected/8b3dc29a-edba-48bc-823b-33b792856873-kube-api-access-4z9bk\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.709138 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.709261 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-etcd-client\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.709423 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ea70ef-743e-44ef-804c-2f1321999baa-serving-cert\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.709975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-serving-cert\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710062 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-serving-cert\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.709873 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-config\") pod 
\"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.709913 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-config\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710143 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710291 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-trusted-ca\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710343 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-policies\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710426 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710464 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710517 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b494454-5efb-466f-81bd-754f7d6fa0a8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710569 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f7958cf-7c2d-4c29-bea8-5871267d5e16-audit-dir\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710610 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/027da92d-9293-48ea-bd00-47b0fcb186fd-images\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 
11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710654 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-client\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.710694 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-encryption-config\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.712166 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-trusted-ca\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.712678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-policies\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.713175 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.713877 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.714155 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-serving-cert\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.714438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158f0672-c017-4e45-a564-96de81f21772-serving-cert\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.716205 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ea70ef-743e-44ef-804c-2f1321999baa-serving-cert\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.716218 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9951c815-3e1f-40ad-8597-b558366ffc58-serving-cert\") pod 
\"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.716349 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-serving-cert\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.716412 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.717097 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.717969 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.718001 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7w4c\" (UniqueName: \"kubernetes.io/projected/93f5445e-7408-4d36-aa4c-a7461f94d75a-kube-api-access-c7w4c\") pod \"cluster-samples-operator-665b6dd947-jkmhc\" (UID: \"93f5445e-7408-4d36-aa4c-a7461f94d75a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.721170 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.723795 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bb2fc9-822c-4f53-98bf-70933744cf7f-serving-cert\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.736878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl6wl\" (UniqueName: \"kubernetes.io/projected/bb42c973-5e2c-4650-b259-e882429363c7-kube-api-access-wl6wl\") pod \"openshift-apiserver-operator-796bbdcf4f-wzc5h\" (UID: \"bb42c973-5e2c-4650-b259-e882429363c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.742242 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.752313 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.760513 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.762605 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.778110 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.782309 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.820760 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822033 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6q9q\" (UniqueName: \"kubernetes.io/projected/083b5af3-1602-4add-a778-86b19df106c2-kube-api-access-l6q9q\") pod \"multus-admission-controller-857f4d67dd-rwxzh\" (UID: \"083b5af3-1602-4add-a778-86b19df106c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822066 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-stats-auth\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b494454-5efb-466f-81bd-754f7d6fa0a8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: 
\"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822103 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9bk\" (UniqueName: \"kubernetes.io/projected/8b3dc29a-edba-48bc-823b-33b792856873-kube-api-access-4z9bk\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822122 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-etcd-client\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822143 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-serving-cert\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b494454-5efb-466f-81bd-754f7d6fa0a8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822174 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2f7958cf-7c2d-4c29-bea8-5871267d5e16-audit-dir\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822189 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/027da92d-9293-48ea-bd00-47b0fcb186fd-images\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822204 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-client\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822233 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-encryption-config\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822265 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/69fd90c9-8767-4d22-b88e-33fafd8026d8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822303 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3dc29a-edba-48bc-823b-33b792856873-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822299 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f7958cf-7c2d-4c29-bea8-5871267d5e16-audit-dir\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822326 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ntm\" (UniqueName: \"kubernetes.io/projected/2f7958cf-7c2d-4c29-bea8-5871267d5e16-kube-api-access-n7ntm\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.822352 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2787200d-e2f9-477b-bb3c-c1c40201f13a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faaf8fb4-0dba-494d-8a14-2dba7901f50a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823264 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-audit-policies\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823293 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b494454-5efb-466f-81bd-754f7d6fa0a8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823320 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ed74e60-8c19-47d2-b760-a6f8678f38da-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/faaf8fb4-0dba-494d-8a14-2dba7901f50a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3dc29a-edba-48bc-823b-33b792856873-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823468 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw4z2\" (UniqueName: \"kubernetes.io/projected/69fd90c9-8767-4d22-b88e-33fafd8026d8-kube-api-access-sw4z2\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823505 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4891f319-eff4-4b7f-912e-45da55cb4fc2-srv-cert\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823528 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4z5l\" (UniqueName: \"kubernetes.io/projected/353bd1c5-bab8-42cc-925a-d9776ac60b6b-kube-api-access-g4z5l\") pod \"migrator-59844c95c7-9kz6f\" (UID: \"353bd1c5-bab8-42cc-925a-d9776ac60b6b\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed74e60-8c19-47d2-b760-a6f8678f38da-config\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823604 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2787200d-e2f9-477b-bb3c-c1c40201f13a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9m2\" (UniqueName: \"kubernetes.io/projected/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-kube-api-access-rb9m2\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-config\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823671 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-metrics-certs\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhgh\" (UniqueName: \"kubernetes.io/projected/9b494454-5efb-466f-81bd-754f7d6fa0a8-kube-api-access-tzhgh\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/083b5af3-1602-4add-a778-86b19df106c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rwxzh\" (UID: \"083b5af3-1602-4add-a778-86b19df106c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823730 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-ca\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823750 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-service-ca\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823806 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-default-certificate\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823839 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f7b66c5-b258-4314-b3a5-e08b958245b6-service-ca-bundle\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823872 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2787200d-e2f9-477b-bb3c-c1c40201f13a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-serving-cert\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbk4\" (UniqueName: \"kubernetes.io/projected/027da92d-9293-48ea-bd00-47b0fcb186fd-kube-api-access-hlbk4\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 
11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823946 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqpz\" (UniqueName: \"kubernetes.io/projected/374ac04a-b37d-42c8-b0ca-e2647c86bc74-kube-api-access-jjqpz\") pod \"dns-operator-744455d44c-wdprh\" (UID: \"374ac04a-b37d-42c8-b0ca-e2647c86bc74\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69fd90c9-8767-4d22-b88e-33fafd8026d8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.823993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faaf8fb4-0dba-494d-8a14-2dba7901f50a-config\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824016 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824042 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/374ac04a-b37d-42c8-b0ca-e2647c86bc74-metrics-tls\") pod \"dns-operator-744455d44c-wdprh\" 
(UID: \"374ac04a-b37d-42c8-b0ca-e2647c86bc74\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824066 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9ww\" (UniqueName: \"kubernetes.io/projected/4891f319-eff4-4b7f-912e-45da55cb4fc2-kube-api-access-ms9ww\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824092 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4891f319-eff4-4b7f-912e-45da55cb4fc2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824116 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dh67\" (UniqueName: \"kubernetes.io/projected/9f7b66c5-b258-4314-b3a5-e08b958245b6-kube-api-access-4dh67\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824138 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/027da92d-9293-48ea-bd00-47b0fcb186fd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/027da92d-9293-48ea-bd00-47b0fcb186fd-proxy-tls\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824203 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ed74e60-8c19-47d2-b760-a6f8678f38da-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.824638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-config\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.825488 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-ca\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.825766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/027da92d-9293-48ea-bd00-47b0fcb186fd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.826981 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.827161 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-audit-policies\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.827316 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-service-ca\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.827994 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2f7958cf-7c2d-4c29-bea8-5871267d5e16-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.830310 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-etcd-client\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.831634 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-serving-cert\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.831872 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-etcd-client\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.836650 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/374ac04a-b37d-42c8-b0ca-e2647c86bc74-metrics-tls\") pod \"dns-operator-744455d44c-wdprh\" (UID: \"374ac04a-b37d-42c8-b0ca-e2647c86bc74\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.836928 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-serving-cert\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.837354 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2f7958cf-7c2d-4c29-bea8-5871267d5e16-encryption-config\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.843743 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.850899 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/083b5af3-1602-4add-a778-86b19df106c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rwxzh\" (UID: \"083b5af3-1602-4add-a778-86b19df106c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.863443 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.885542 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.895072 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3dc29a-edba-48bc-823b-33b792856873-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.904638 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.923377 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.943997 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 11:38:57 crc 
kubenswrapper[4922]: I0218 11:38:57.956915 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3dc29a-edba-48bc-823b-33b792856873-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.964551 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ks48g"] Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.966330 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 11:38:57 crc kubenswrapper[4922]: I0218 11:38:57.982300 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:57.999862 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc"] Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.003816 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.010386 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-metrics-certs\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.023581 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.038539 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-stats-auth\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.043324 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.046189 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h"] Feb 18 11:38:58 crc kubenswrapper[4922]: W0218 11:38:58.052081 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb42c973_5e2c_4650_b259_e882429363c7.slice/crio-ebbff29cb34fa65f1084ac364e70873185e148520ea29fdb2e74a7cc8a4e4e6c WatchSource:0}: Error finding container ebbff29cb34fa65f1084ac364e70873185e148520ea29fdb2e74a7cc8a4e4e6c: Status 404 returned error can't find the container with id ebbff29cb34fa65f1084ac364e70873185e148520ea29fdb2e74a7cc8a4e4e6c Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.061574 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.065313 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f7b66c5-b258-4314-b3a5-e08b958245b6-service-ca-bundle\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.081859 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 11:38:58 
crc kubenswrapper[4922]: I0218 11:38:58.103016 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.109296 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f7b66c5-b258-4314-b3a5-e08b958245b6-default-certificate\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.122510 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.142977 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.162982 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.173380 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69fd90c9-8767-4d22-b88e-33fafd8026d8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.182298 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.186616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/69fd90c9-8767-4d22-b88e-33fafd8026d8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.201848 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.208793 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5sz92"] Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.222846 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.229228 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4891f319-eff4-4b7f-912e-45da55cb4fc2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.243297 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.262271 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.283006 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.301992 4922 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.321979 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.323637 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/027da92d-9293-48ea-bd00-47b0fcb186fd-images\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.343257 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.361857 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.367792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027da92d-9293-48ea-bd00-47b0fcb186fd-proxy-tls\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.382699 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.402385 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.405765 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b494454-5efb-466f-81bd-754f7d6fa0a8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.422608 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.450668 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.457518 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b494454-5efb-466f-81bd-754f7d6fa0a8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.462200 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.487576 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.502778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.522894 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.542682 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.562386 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.568865 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2787200d-e2f9-477b-bb3c-c1c40201f13a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.582116 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.590832 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2787200d-e2f9-477b-bb3c-c1c40201f13a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.600634 4922 request.go:700] Waited for 1.009919302s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/secrets?fieldSelector=metadata.name%3Dkube-apiserver-operator-dockercfg-x57mr&limit=500&resourceVersion=0 Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.602159 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 11:38:58 crc 
kubenswrapper[4922]: I0218 11:38:58.622920 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.643535 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.653823 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faaf8fb4-0dba-494d-8a14-2dba7901f50a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.663134 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.702882 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.710972 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ed74e60-8c19-47d2-b760-a6f8678f38da-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.722819 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.738890 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faaf8fb4-0dba-494d-8a14-2dba7901f50a-config\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.743380 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.754442 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" event={"ID":"bb42c973-5e2c-4650-b259-e882429363c7","Type":"ContainerStarted","Data":"ca8df75d49277b64cc8877f85c707ec70825cdc4a38fe081dd647da1feeb98df"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.754598 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" event={"ID":"bb42c973-5e2c-4650-b259-e882429363c7","Type":"ContainerStarted","Data":"ebbff29cb34fa65f1084ac364e70873185e148520ea29fdb2e74a7cc8a4e4e6c"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.756354 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" event={"ID":"4a5c3121-2765-47df-aa3f-22595e4b4ea9","Type":"ContainerStarted","Data":"ade595eb8b0948ce173e8e45817a8322764f6d260984a62fb381433fde6294fb"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.756448 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" event={"ID":"4a5c3121-2765-47df-aa3f-22595e4b4ea9","Type":"ContainerStarted","Data":"3bc876637c4a8bcadaa5fbf2bc10accf11c2546eae002cfe27908bbf1fbbbfc5"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.756459 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" event={"ID":"4a5c3121-2765-47df-aa3f-22595e4b4ea9","Type":"ContainerStarted","Data":"5dd5c00b4cbcf728ec53c4032a30beeb3e8a997561275183dea55345059ab4f5"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.758826 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc" event={"ID":"93f5445e-7408-4d36-aa4c-a7461f94d75a","Type":"ContainerStarted","Data":"70146e719eec72378b931a7ae745ea7c36852f057a7bc8fdc74705f731e4ee3f"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.758864 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc" event={"ID":"93f5445e-7408-4d36-aa4c-a7461f94d75a","Type":"ContainerStarted","Data":"33446b54746bccaa123f769e3809f84478a1c66a3bf1f31ae3381c80a093c8c5"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.758876 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc" event={"ID":"93f5445e-7408-4d36-aa4c-a7461f94d75a","Type":"ContainerStarted","Data":"e550ec09022128d220868962aaff5d661987c7f6f617d145ed7f69f0bee436b8"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.762443 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.762753 4922 generic.go:334] "Generic (PLEG): container finished" podID="8719fb44-5fea-4fd5-a516-5d2ab11c221c" containerID="7e253d56157209fb39f9490236d9d75719c1e2515a17451fa024a2c0b0ee3f80" exitCode=0 Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.762794 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" 
event={"ID":"8719fb44-5fea-4fd5-a516-5d2ab11c221c","Type":"ContainerDied","Data":"7e253d56157209fb39f9490236d9d75719c1e2515a17451fa024a2c0b0ee3f80"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.762831 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" event={"ID":"8719fb44-5fea-4fd5-a516-5d2ab11c221c","Type":"ContainerStarted","Data":"20e7dce9a6acaf67ad64fbf40b2f8911ea37201b34483d36238d92a2148409e0"} Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.768467 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4891f319-eff4-4b7f-912e-45da55cb4fc2-srv-cert\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.783079 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.797116 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed74e60-8c19-47d2-b760-a6f8678f38da-config\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.802441 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.822425 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.842444 4922 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.861598 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.882583 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.910645 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.922268 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.942851 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.962788 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 11:38:58 crc kubenswrapper[4922]: I0218 11:38:58.983144 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.002104 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.022063 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.042628 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.062497 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.082499 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.102006 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.121856 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.141800 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.163116 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.183411 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.201882 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.222970 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.242927 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.262028 4922 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.283390 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.302939 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.323165 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.341987 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.361966 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.383308 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.404914 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.422755 4922 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.442928 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.462452 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.485571 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.502419 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.523225 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.564158 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfkj\" (UniqueName: \"kubernetes.io/projected/9951c815-3e1f-40ad-8597-b558366ffc58-kube-api-access-6jfkj\") pod \"authentication-operator-69f744f599-78x9f\" (UID: \"9951c815-3e1f-40ad-8597-b558366ffc58\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.580115 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6lpr\" (UniqueName: \"kubernetes.io/projected/f4ea70ef-743e-44ef-804c-2f1321999baa-kube-api-access-t6lpr\") pod \"openshift-config-operator-7777fb866f-7twg2\" (UID: \"f4ea70ef-743e-44ef-804c-2f1321999baa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.588590 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.607033 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k46td\" (UniqueName: \"kubernetes.io/projected/158f0672-c017-4e45-a564-96de81f21772-kube-api-access-k46td\") pod \"controller-manager-879f6c89f-pj7qx\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.622469 4922 request.go:700] Waited for 1.926391199s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.623665 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhj97\" (UniqueName: \"kubernetes.io/projected/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-kube-api-access-dhj97\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.645463 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbvn\" (UniqueName: \"kubernetes.io/projected/ca3008b0-2ba6-4dfd-9fea-d1890e2af197-kube-api-access-sfbvn\") pod \"console-operator-58897d9998-sddqb\" (UID: \"ca3008b0-2ba6-4dfd-9fea-d1890e2af197\") " pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.657376 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6m8\" (UniqueName: \"kubernetes.io/projected/afb43c7e-87bc-4450-ad81-6a22161fb794-kube-api-access-cz6m8\") pod 
\"machine-approver-56656f9798-fs2lx\" (UID: \"afb43c7e-87bc-4450-ad81-6a22161fb794\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.681331 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slf52\" (UniqueName: \"kubernetes.io/projected/14e81dbf-6c73-481c-b758-4c15cc0f3258-kube-api-access-slf52\") pod \"console-f9d7485db-nfn89\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.699659 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lkpw\" (UniqueName: \"kubernetes.io/projected/c3bb2fc9-822c-4f53-98bf-70933744cf7f-kube-api-access-6lkpw\") pod \"route-controller-manager-6576b87f9c-v6m8j\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.716963 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhx5p\" (UniqueName: \"kubernetes.io/projected/48dabf7e-d1d7-48b6-bc70-5cc88cdcf994-kube-api-access-jhx5p\") pod \"downloads-7954f5f757-b6dxx\" (UID: \"48dabf7e-d1d7-48b6-bc70-5cc88cdcf994\") " pod="openshift-console/downloads-7954f5f757-b6dxx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.721935 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.729884 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.738756 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.740998 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8890a31d-16c4-4c0a-a1f8-4ce9d3314a80-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xvfbm\" (UID: \"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.761098 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpknq\" (UniqueName: \"kubernetes.io/projected/9e36551d-13cd-4a75-a29b-658850b46cb8-kube-api-access-mpknq\") pod \"oauth-openshift-558db77b4-q7mwg\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.774490 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" event={"ID":"8719fb44-5fea-4fd5-a516-5d2ab11c221c","Type":"ContainerStarted","Data":"f5d8595b5efb56b0c783073cffd5e94a9e803bd2a79aa2e9dd4e875d38705733"} Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.774542 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" event={"ID":"8719fb44-5fea-4fd5-a516-5d2ab11c221c","Type":"ContainerStarted","Data":"2d3ac63287f54d65348280c81b34792f9bb2289093e3618f46bac52340509cb5"} Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.777260 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6q9q\" (UniqueName: \"kubernetes.io/projected/083b5af3-1602-4add-a778-86b19df106c2-kube-api-access-l6q9q\") pod \"multus-admission-controller-857f4d67dd-rwxzh\" (UID: \"083b5af3-1602-4add-a778-86b19df106c2\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.778600 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" event={"ID":"afb43c7e-87bc-4450-ad81-6a22161fb794","Type":"ContainerStarted","Data":"a6cdb6374783bfcbb321fe6dc0b32c039af0469cdf0fb76100ab334fb5f8f006"} Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.793925 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.802431 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ntm\" (UniqueName: \"kubernetes.io/projected/2f7958cf-7c2d-4c29-bea8-5871267d5e16-kube-api-access-n7ntm\") pod \"apiserver-7bbb656c7d-vhdd8\" (UID: \"2f7958cf-7c2d-4c29-bea8-5871267d5e16\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.824425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9m2\" (UniqueName: \"kubernetes.io/projected/6ae30939-0d1c-4856-86e0-2b0b4797fa6a-kube-api-access-rb9m2\") pod \"etcd-operator-b45778765-prk5g\" (UID: \"6ae30939-0d1c-4856-86e0-2b0b4797fa6a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.830513 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.839602 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhgh\" (UniqueName: \"kubernetes.io/projected/9b494454-5efb-466f-81bd-754f7d6fa0a8-kube-api-access-tzhgh\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.861203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4z5l\" (UniqueName: \"kubernetes.io/projected/353bd1c5-bab8-42cc-925a-d9776ac60b6b-kube-api-access-g4z5l\") pod \"migrator-59844c95c7-9kz6f\" (UID: \"353bd1c5-bab8-42cc-925a-d9776ac60b6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.890925 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b494454-5efb-466f-81bd-754f7d6fa0a8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9b55m\" (UID: \"9b494454-5efb-466f-81bd-754f7d6fa0a8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.894838 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.897027 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.903950 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dh67\" (UniqueName: \"kubernetes.io/projected/9f7b66c5-b258-4314-b3a5-e08b958245b6-kube-api-access-4dh67\") pod \"router-default-5444994796-ktkz9\" (UID: \"9f7b66c5-b258-4314-b3a5-e08b958245b6\") " pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.919766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9ww\" (UniqueName: \"kubernetes.io/projected/4891f319-eff4-4b7f-912e-45da55cb4fc2-kube-api-access-ms9ww\") pod \"olm-operator-6b444d44fb-zm5zz\" (UID: \"4891f319-eff4-4b7f-912e-45da55cb4fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.925768 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.937692 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sddqb" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.940641 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faaf8fb4-0dba-494d-8a14-2dba7901f50a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wm5bk\" (UID: \"faaf8fb4-0dba-494d-8a14-2dba7901f50a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.953725 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.961013 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ed74e60-8c19-47d2-b760-a6f8678f38da-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r84r2\" (UID: \"0ed74e60-8c19-47d2-b760-a6f8678f38da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.977456 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b6dxx" Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.988540 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7twg2"] Feb 18 11:38:59 crc kubenswrapper[4922]: I0218 11:38:59.989610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9bk\" (UniqueName: \"kubernetes.io/projected/8b3dc29a-edba-48bc-823b-33b792856873-kube-api-access-4z9bk\") pod \"openshift-controller-manager-operator-756b6f6bc6-nbzck\" (UID: \"8b3dc29a-edba-48bc-823b-33b792856873\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.003095 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbk4\" (UniqueName: \"kubernetes.io/projected/027da92d-9293-48ea-bd00-47b0fcb186fd-kube-api-access-hlbk4\") pod \"machine-config-operator-74547568cd-jkv6t\" (UID: \"027da92d-9293-48ea-bd00-47b0fcb186fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.022744 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sw4z2\" (UniqueName: \"kubernetes.io/projected/69fd90c9-8767-4d22-b88e-33fafd8026d8-kube-api-access-sw4z2\") pod \"kube-storage-version-migrator-operator-b67b599dd-js5gq\" (UID: \"69fd90c9-8767-4d22-b88e-33fafd8026d8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.049337 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.051104 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nfn89"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.051140 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-78x9f"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.051321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqpz\" (UniqueName: \"kubernetes.io/projected/374ac04a-b37d-42c8-b0ca-e2647c86bc74-kube-api-access-jjqpz\") pod \"dns-operator-744455d44c-wdprh\" (UID: \"374ac04a-b37d-42c8-b0ca-e2647c86bc74\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.064060 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2787200d-e2f9-477b-bb3c-c1c40201f13a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gst7r\" (UID: \"2787200d-e2f9-477b-bb3c-c1c40201f13a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.071287 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:39:00 crc kubenswrapper[4922]: W0218 11:39:00.083470 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e81dbf_6c73_481c_b758_4c15cc0f3258.slice/crio-e3e0521f1d2586e618292c2cbf0b96f0a5f28185a418b592c2676c55e3372f97 WatchSource:0}: Error finding container e3e0521f1d2586e618292c2cbf0b96f0a5f28185a418b592c2676c55e3372f97: Status 404 returned error can't find the container with id e3e0521f1d2586e618292c2cbf0b96f0a5f28185a418b592c2676c55e3372f97 Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.094532 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.114446 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.114744 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.134014 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.135005 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pj7qx"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.135880 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rwxzh"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.146193 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.163920 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.167805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-tls\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.167893 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56jvt\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-kube-api-access-56jvt\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.167930 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5z7\" (UniqueName: \"kubernetes.io/projected/642557ec-2e08-451b-8a4c-b4e8cf88f048-kube-api-access-np5z7\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.167990 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: 
\"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168027 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76fv\" (UniqueName: \"kubernetes.io/projected/5566256e-1d22-41b3-8c9b-5765acbf0425-kube-api-access-t76fv\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168057 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/642557ec-2e08-451b-8a4c-b4e8cf88f048-proxy-tls\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168081 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5566256e-1d22-41b3-8c9b-5765acbf0425-profile-collector-cert\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168108 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-certificates\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168170 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-bound-sa-token\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168223 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/642557ec-2e08-451b-8a4c-b4e8cf88f048-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168273 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5566256e-1d22-41b3-8c9b-5765acbf0425-srv-cert\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168304 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168333 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-trusted-ca\") pod 
\"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.168412 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.168947 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:00.668926787 +0000 UTC m=+142.396631087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.173960 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.183134 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.188079 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.234319 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" Feb 18 11:39:00 crc kubenswrapper[4922]: W0218 11:39:00.238126 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158f0672_c017_4e45_a564_96de81f21772.slice/crio-2d992c5fc4f578358729ef2e1374662b53a6ba4618dfe838d93d24e0c5e5a820 WatchSource:0}: Error finding container 2d992c5fc4f578358729ef2e1374662b53a6ba4618dfe838d93d24e0c5e5a820: Status 404 returned error can't find the container with id 2d992c5fc4f578358729ef2e1374662b53a6ba4618dfe838d93d24e0c5e5a820 Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.269706 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.270347 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-csi-data-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.270436 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5566256e-1d22-41b3-8c9b-5765acbf0425-srv-cert\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: 
\"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.270607 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbmc\" (UniqueName: \"kubernetes.io/projected/75c707c4-5c62-438f-8312-2307d3ef0ba8-kube-api-access-qdbmc\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.270641 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66dfc07c-fa4c-48ad-9904-2a767310c6ac-cert\") pod \"ingress-canary-9hdml\" (UID: \"66dfc07c-fa4c-48ad-9904-2a767310c6ac\") " pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.270693 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271163 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-trusted-ca\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271318 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4snbf\" (UniqueName: 
\"kubernetes.io/projected/a62974f3-68b4-451d-9887-bf8af554ace0-kube-api-access-4snbf\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271383 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-tls\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271401 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271417 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5ql\" (UniqueName: \"kubernetes.io/projected/aa233e7a-8a71-495c-b696-2f3dac9f0ada-kube-api-access-2h5ql\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271472 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56jvt\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-kube-api-access-56jvt\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc 
kubenswrapper[4922]: I0218 11:39:00.271542 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrhx\" (UniqueName: \"kubernetes.io/projected/913f9471-59b8-4494-964d-0db4086d77ab-kube-api-access-9zrhx\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271584 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-registration-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271648 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np5z7\" (UniqueName: \"kubernetes.io/projected/642557ec-2e08-451b-8a4c-b4e8cf88f048-kube-api-access-np5z7\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271739 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p2v8\" (UniqueName: \"kubernetes.io/projected/d17bc269-b566-4738-ac8f-354d91dd9245-kube-api-access-7p2v8\") pod \"package-server-manager-789f6589d5-rj4b9\" (UID: \"d17bc269-b566-4738-ac8f-354d91dd9245\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271794 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/75c707c4-5c62-438f-8312-2307d3ef0ba8-config-volume\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271831 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/984302b1-545d-474c-a808-8c8f716e580e-node-bootstrap-token\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271919 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-socket-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271959 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfd5c\" (UniqueName: \"kubernetes.io/projected/a768634b-1586-4ba2-9a05-6a88f5befea1-kube-api-access-pfd5c\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.271975 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-metrics-tls\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272036 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/913f9471-59b8-4494-964d-0db4086d77ab-webhook-cert\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272052 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272076 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c707c4-5c62-438f-8312-2307d3ef0ba8-secret-volume\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76fv\" (UniqueName: \"kubernetes.io/projected/5566256e-1d22-41b3-8c9b-5765acbf0425-kube-api-access-t76fv\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: 
\"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272334 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/642557ec-2e08-451b-8a4c-b4e8cf88f048-proxy-tls\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272400 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/913f9471-59b8-4494-964d-0db4086d77ab-tmpfs\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272735 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/984302b1-545d-474c-a808-8c8f716e580e-certs\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.272798 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-config-volume\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.273440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/5566256e-1d22-41b3-8c9b-5765acbf0425-profile-collector-cert\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.273765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-certificates\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.273956 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/adf4d88c-a19b-49bf-bb62-eef23b55efae-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6msr\" (UID: \"adf4d88c-a19b-49bf-bb62-eef23b55efae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.274303 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-bound-sa-token\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.274858 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vhsm\" (UniqueName: \"kubernetes.io/projected/66dfc07c-fa4c-48ad-9904-2a767310c6ac-kube-api-access-9vhsm\") pod \"ingress-canary-9hdml\" (UID: \"66dfc07c-fa4c-48ad-9904-2a767310c6ac\") " 
pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.275016 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a62974f3-68b4-451d-9887-bf8af554ace0-signing-cabundle\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.275154 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17bc269-b566-4738-ac8f-354d91dd9245-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rj4b9\" (UID: \"d17bc269-b566-4738-ac8f-354d91dd9245\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.275319 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817e4164-fa3e-41e5-8638-6a512b9d28bf-config\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.275351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwt9m\" (UniqueName: \"kubernetes.io/projected/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-kube-api-access-wwt9m\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.275553 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfssv\" 
(UniqueName: \"kubernetes.io/projected/adf4d88c-a19b-49bf-bb62-eef23b55efae-kube-api-access-qfssv\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6msr\" (UID: \"adf4d88c-a19b-49bf-bb62-eef23b55efae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.276421 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:00.776393517 +0000 UTC m=+142.504097597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.277258 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-certificates\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.275601 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/817e4164-fa3e-41e5-8638-6a512b9d28bf-serving-cert\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 
11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.287560 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh5mm\" (UniqueName: \"kubernetes.io/projected/817e4164-fa3e-41e5-8638-6a512b9d28bf-kube-api-access-lh5mm\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.291524 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.292025 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/913f9471-59b8-4494-964d-0db4086d77ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.292188 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/642557ec-2e08-451b-8a4c-b4e8cf88f048-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.292233 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcgp\" (UniqueName: 
\"kubernetes.io/projected/984302b1-545d-474c-a808-8c8f716e580e-kube-api-access-qkcgp\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.292817 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a62974f3-68b4-451d-9887-bf8af554ace0-signing-key\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.293256 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/642557ec-2e08-451b-8a4c-b4e8cf88f048-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.293282 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-mountpoint-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.293327 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-plugins-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.293436 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-trusted-ca\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.301515 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-tls\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.304185 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.304445 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5566256e-1d22-41b3-8c9b-5765acbf0425-srv-cert\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.305347 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5566256e-1d22-41b3-8c9b-5765acbf0425-profile-collector-cert\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc 
kubenswrapper[4922]: I0218 11:39:00.314506 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/642557ec-2e08-451b-8a4c-b4e8cf88f048-proxy-tls\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.328213 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-bound-sa-token\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.346723 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76fv\" (UniqueName: \"kubernetes.io/projected/5566256e-1d22-41b3-8c9b-5765acbf0425-kube-api-access-t76fv\") pod \"catalog-operator-68c6474976-p4qtq\" (UID: \"5566256e-1d22-41b3-8c9b-5765acbf0425\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.394862 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.394918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4snbf\" (UniqueName: \"kubernetes.io/projected/a62974f3-68b4-451d-9887-bf8af554ace0-kube-api-access-4snbf\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: 
\"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.394940 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.394965 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5ql\" (UniqueName: \"kubernetes.io/projected/aa233e7a-8a71-495c-b696-2f3dac9f0ada-kube-api-access-2h5ql\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.394989 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrhx\" (UniqueName: \"kubernetes.io/projected/913f9471-59b8-4494-964d-0db4086d77ab-kube-api-access-9zrhx\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-registration-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p2v8\" (UniqueName: 
\"kubernetes.io/projected/d17bc269-b566-4738-ac8f-354d91dd9245-kube-api-access-7p2v8\") pod \"package-server-manager-789f6589d5-rj4b9\" (UID: \"d17bc269-b566-4738-ac8f-354d91dd9245\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395046 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c707c4-5c62-438f-8312-2307d3ef0ba8-config-volume\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395069 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/984302b1-545d-474c-a808-8c8f716e580e-node-bootstrap-token\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395087 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfd5c\" (UniqueName: \"kubernetes.io/projected/a768634b-1586-4ba2-9a05-6a88f5befea1-kube-api-access-pfd5c\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395103 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-socket-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395118 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/913f9471-59b8-4494-964d-0db4086d77ab-webhook-cert\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395133 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-metrics-tls\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c707c4-5c62-438f-8312-2307d3ef0ba8-secret-volume\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395167 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395191 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/913f9471-59b8-4494-964d-0db4086d77ab-tmpfs\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395206 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/984302b1-545d-474c-a808-8c8f716e580e-certs\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395221 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-config-volume\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/adf4d88c-a19b-49bf-bb62-eef23b55efae-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6msr\" (UID: \"adf4d88c-a19b-49bf-bb62-eef23b55efae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395265 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a62974f3-68b4-451d-9887-bf8af554ace0-signing-cabundle\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17bc269-b566-4738-ac8f-354d91dd9245-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rj4b9\" (UID: \"d17bc269-b566-4738-ac8f-354d91dd9245\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vhsm\" (UniqueName: \"kubernetes.io/projected/66dfc07c-fa4c-48ad-9904-2a767310c6ac-kube-api-access-9vhsm\") pod \"ingress-canary-9hdml\" (UID: \"66dfc07c-fa4c-48ad-9904-2a767310c6ac\") " pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwt9m\" (UniqueName: \"kubernetes.io/projected/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-kube-api-access-wwt9m\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395328 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817e4164-fa3e-41e5-8638-6a512b9d28bf-config\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395351 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfssv\" (UniqueName: \"kubernetes.io/projected/adf4d88c-a19b-49bf-bb62-eef23b55efae-kube-api-access-qfssv\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6msr\" (UID: \"adf4d88c-a19b-49bf-bb62-eef23b55efae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395371 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/817e4164-fa3e-41e5-8638-6a512b9d28bf-serving-cert\") pod 
\"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395411 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/913f9471-59b8-4494-964d-0db4086d77ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395434 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh5mm\" (UniqueName: \"kubernetes.io/projected/817e4164-fa3e-41e5-8638-6a512b9d28bf-kube-api-access-lh5mm\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395455 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcgp\" (UniqueName: \"kubernetes.io/projected/984302b1-545d-474c-a808-8c8f716e580e-kube-api-access-qkcgp\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a62974f3-68b4-451d-9887-bf8af554ace0-signing-key\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395503 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" 
(UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-mountpoint-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395517 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-plugins-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395535 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-csi-data-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395555 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbmc\" (UniqueName: \"kubernetes.io/projected/75c707c4-5c62-438f-8312-2307d3ef0ba8-kube-api-access-qdbmc\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.395570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66dfc07c-fa4c-48ad-9904-2a767310c6ac-cert\") pod \"ingress-canary-9hdml\" (UID: \"66dfc07c-fa4c-48ad-9904-2a767310c6ac\") " pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.396155 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:00.896120407 +0000 UTC m=+142.623824487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.399405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-socket-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.399649 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-registration-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.401413 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c707c4-5c62-438f-8312-2307d3ef0ba8-config-volume\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.402955 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-csi-data-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.403054 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-mountpoint-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.403066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a768634b-1586-4ba2-9a05-6a88f5befea1-plugins-dir\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.404045 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a62974f3-68b4-451d-9887-bf8af554ace0-signing-cabundle\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.404413 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/913f9471-59b8-4494-964d-0db4086d77ab-tmpfs\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.404650 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.405078 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817e4164-fa3e-41e5-8638-6a512b9d28bf-config\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.405936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-config-volume\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.412715 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/913f9471-59b8-4494-964d-0db4086d77ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.412828 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np5z7\" (UniqueName: \"kubernetes.io/projected/642557ec-2e08-451b-8a4c-b4e8cf88f048-kube-api-access-np5z7\") pod \"machine-config-controller-84d6567774-h8l7q\" (UID: \"642557ec-2e08-451b-8a4c-b4e8cf88f048\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.432422 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17bc269-b566-4738-ac8f-354d91dd9245-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rj4b9\" (UID: \"d17bc269-b566-4738-ac8f-354d91dd9245\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.433259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-metrics-tls\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.433354 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a62974f3-68b4-451d-9887-bf8af554ace0-signing-key\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.433940 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/984302b1-545d-474c-a808-8c8f716e580e-certs\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.435026 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/817e4164-fa3e-41e5-8638-6a512b9d28bf-serving-cert\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.435260 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/66dfc07c-fa4c-48ad-9904-2a767310c6ac-cert\") pod \"ingress-canary-9hdml\" (UID: \"66dfc07c-fa4c-48ad-9904-2a767310c6ac\") " pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.435287 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/adf4d88c-a19b-49bf-bb62-eef23b55efae-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6msr\" (UID: \"adf4d88c-a19b-49bf-bb62-eef23b55efae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.435448 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56jvt\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-kube-api-access-56jvt\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.436620 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.436850 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.437000 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/984302b1-545d-474c-a808-8c8f716e580e-node-bootstrap-token\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.437477 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c707c4-5c62-438f-8312-2307d3ef0ba8-secret-volume\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.440838 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4snbf\" (UniqueName: \"kubernetes.io/projected/a62974f3-68b4-451d-9887-bf8af554ace0-kube-api-access-4snbf\") pod \"service-ca-9c57cc56f-ppzj4\" (UID: \"a62974f3-68b4-451d-9887-bf8af554ace0\") " pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.450404 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/913f9471-59b8-4494-964d-0db4086d77ab-webhook-cert\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.456813 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.464371 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5ql\" (UniqueName: \"kubernetes.io/projected/aa233e7a-8a71-495c-b696-2f3dac9f0ada-kube-api-access-2h5ql\") pod \"marketplace-operator-79b997595-nc7b9\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.480213 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrhx\" (UniqueName: \"kubernetes.io/projected/913f9471-59b8-4494-964d-0db4086d77ab-kube-api-access-9zrhx\") pod \"packageserver-d55dfcdfc-vgs8b\" (UID: \"913f9471-59b8-4494-964d-0db4086d77ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.497073 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.497590 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:00.997555935 +0000 UTC m=+142.725260015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.512668 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p2v8\" (UniqueName: \"kubernetes.io/projected/d17bc269-b566-4738-ac8f-354d91dd9245-kube-api-access-7p2v8\") pod \"package-server-manager-789f6589d5-rj4b9\" (UID: \"d17bc269-b566-4738-ac8f-354d91dd9245\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.524061 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q7mwg"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.531007 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.546483 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.549958 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh5mm\" (UniqueName: \"kubernetes.io/projected/817e4164-fa3e-41e5-8638-6a512b9d28bf-kube-api-access-lh5mm\") pod \"service-ca-operator-777779d784-v4cwm\" (UID: \"817e4164-fa3e-41e5-8638-6a512b9d28bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.555610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcgp\" (UniqueName: \"kubernetes.io/projected/984302b1-545d-474c-a808-8c8f716e580e-kube-api-access-qkcgp\") pod \"machine-config-server-l8pr7\" (UID: \"984302b1-545d-474c-a808-8c8f716e580e\") " pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.560567 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.574731 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.598488 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfd5c\" (UniqueName: \"kubernetes.io/projected/a768634b-1586-4ba2-9a05-6a88f5befea1-kube-api-access-pfd5c\") pod \"csi-hostpathplugin-x69t8\" (UID: \"a768634b-1586-4ba2-9a05-6a88f5befea1\") " pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.599854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.600209 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.100193153 +0000 UTC m=+142.827897233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.601738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwt9m\" (UniqueName: \"kubernetes.io/projected/d523bc23-dd6a-4d1f-b72b-2070ecce0cde-kube-api-access-wwt9m\") pod \"dns-default-gbpm4\" (UID: \"d523bc23-dd6a-4d1f-b72b-2070ecce0cde\") " pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.605770 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.619113 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vhsm\" (UniqueName: \"kubernetes.io/projected/66dfc07c-fa4c-48ad-9904-2a767310c6ac-kube-api-access-9vhsm\") pod \"ingress-canary-9hdml\" (UID: \"66dfc07c-fa4c-48ad-9904-2a767310c6ac\") " pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.638940 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9hdml" Feb 18 11:39:00 crc kubenswrapper[4922]: W0218 11:39:00.639353 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e36551d_13cd_4a75_a29b_658850b46cb8.slice/crio-f32dc9e435ded17fa5224c75138342bd559fb562ee11ecfd7567b73083fd4cff WatchSource:0}: Error finding container f32dc9e435ded17fa5224c75138342bd559fb562ee11ecfd7567b73083fd4cff: Status 404 returned error can't find the container with id f32dc9e435ded17fa5224c75138342bd559fb562ee11ecfd7567b73083fd4cff Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.639513 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.654228 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l8pr7" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.659817 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbmc\" (UniqueName: \"kubernetes.io/projected/75c707c4-5c62-438f-8312-2307d3ef0ba8-kube-api-access-qdbmc\") pod \"collect-profiles-29523570-k54zv\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.663471 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfssv\" (UniqueName: \"kubernetes.io/projected/adf4d88c-a19b-49bf-bb62-eef23b55efae-kube-api-access-qfssv\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6msr\" (UID: \"adf4d88c-a19b-49bf-bb62-eef23b55efae\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: W0218 11:39:00.669975 4922 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod353bd1c5_bab8_42cc_925a_d9776ac60b6b.slice/crio-270a203474ba84f435ec22b9c71503106e1a4c970e2a080f19dca4ce0186bede WatchSource:0}: Error finding container 270a203474ba84f435ec22b9c71503106e1a4c970e2a080f19dca4ce0186bede: Status 404 returned error can't find the container with id 270a203474ba84f435ec22b9c71503106e1a4c970e2a080f19dca4ce0186bede Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.682606 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.690323 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.700518 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.700612 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.200593124 +0000 UTC m=+142.928297204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.700743 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.701020 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.201011425 +0000 UTC m=+142.928715505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.706475 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sddqb"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.741874 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.743616 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b6dxx"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.746596 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm"] Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.801785 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.802158 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:39:01.302126854 +0000 UTC m=+143.029830934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.844886 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" event={"ID":"9951c815-3e1f-40ad-8597-b558366ffc58","Type":"ContainerStarted","Data":"ff075026a738e076b481a73f3a75acb8d6738e1e24148acad8afe689bd021563"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.845345 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" event={"ID":"9951c815-3e1f-40ad-8597-b558366ffc58","Type":"ContainerStarted","Data":"4db9590941a93d410d1f3e94ff9fc3e772d8bf8bdac79a6ee60fb31a1b40a31b"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.851567 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nfn89" event={"ID":"14e81dbf-6c73-481c-b758-4c15cc0f3258","Type":"ContainerStarted","Data":"e3e0521f1d2586e618292c2cbf0b96f0a5f28185a418b592c2676c55e3372f97"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.853488 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" event={"ID":"353bd1c5-bab8-42cc-925a-d9776ac60b6b","Type":"ContainerStarted","Data":"270a203474ba84f435ec22b9c71503106e1a4c970e2a080f19dca4ce0186bede"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.859080 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ktkz9" event={"ID":"9f7b66c5-b258-4314-b3a5-e08b958245b6","Type":"ContainerStarted","Data":"4dc87b2fbe3355753c06c4226a76516acee6e6b7368c2539573d63e8626cd376"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.876547 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" event={"ID":"9e36551d-13cd-4a75-a29b-658850b46cb8","Type":"ContainerStarted","Data":"f32dc9e435ded17fa5224c75138342bd559fb562ee11ecfd7567b73083fd4cff"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.885097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" event={"ID":"083b5af3-1602-4add-a778-86b19df106c2","Type":"ContainerStarted","Data":"ced3405a371a1faa7601431d9e2a5855cd3c8213c521cab1927ce6b56082b1db"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.903848 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:00 crc kubenswrapper[4922]: E0218 11:39:00.904179 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.404168597 +0000 UTC m=+143.131872677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.907476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" event={"ID":"afb43c7e-87bc-4450-ad81-6a22161fb794","Type":"ContainerStarted","Data":"7bd8c637f253795bbee1f4903cb5578e614d80fa7ebd4b253117e3d49cebab10"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.913807 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.916534 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" event={"ID":"158f0672-c017-4e45-a564-96de81f21772","Type":"ContainerStarted","Data":"2d992c5fc4f578358729ef2e1374662b53a6ba4618dfe838d93d24e0c5e5a820"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.917855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" event={"ID":"f4ea70ef-743e-44ef-804c-2f1321999baa","Type":"ContainerStarted","Data":"7395b686de9c51569b28e03f4f310f89d8701a436f73b30737cce70fa5185b5b"} Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.944985 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:00 crc kubenswrapper[4922]: I0218 11:39:00.977237 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5sz92" podStartSLOduration=121.977220906 podStartE2EDuration="2m1.977220906s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:00.975746409 +0000 UTC m=+142.703450489" watchObservedRunningTime="2026-02-18 11:39:00.977220906 +0000 UTC m=+142.704924986" Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.004238 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.004634 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.504620999 +0000 UTC m=+143.232325079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: W0218 11:39:01.028067 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3bb2fc9_822c_4f53_98bf_70933744cf7f.slice/crio-52db1346d3e5c28577ccc9c9f0dc56c1b32dc4dc3a9883af1710dab9823da00c WatchSource:0}: Error finding container 52db1346d3e5c28577ccc9c9f0dc56c1b32dc4dc3a9883af1710dab9823da00c: Status 404 returned error can't find the container with id 52db1346d3e5c28577ccc9c9f0dc56c1b32dc4dc3a9883af1710dab9823da00c Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.114176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.114512 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.614501311 +0000 UTC m=+143.342205391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.216609 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.216910 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.716893202 +0000 UTC m=+143.444597282 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.262854 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.277410 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wdprh"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.297116 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.318134 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.318533 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.818521125 +0000 UTC m=+143.546225205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.355669 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-prk5g"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.376349 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.384194 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" podStartSLOduration=123.384179947 podStartE2EDuration="2m3.384179947s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:01.38234412 +0000 UTC m=+143.110048190" watchObservedRunningTime="2026-02-18 11:39:01.384179947 +0000 UTC m=+143.111884027" Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.419001 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.419329 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:01.919306476 +0000 UTC m=+143.647010626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.450213 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.455174 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.457502 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.521536 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.522209 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.02219669 +0000 UTC m=+143.749900770 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.602162 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.628869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.628915 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz"] Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.629262 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.629641 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.129626669 +0000 UTC m=+143.857330749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.653716 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jkmhc" podStartSLOduration=123.653700259 podStartE2EDuration="2m3.653700259s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:01.652709194 +0000 UTC m=+143.380413274" watchObservedRunningTime="2026-02-18 11:39:01.653700259 +0000 UTC m=+143.381404339" Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.731344 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.731806 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.231789235 +0000 UTC m=+143.959493315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:01 crc kubenswrapper[4922]: I0218 11:39:01.832519 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:01 crc kubenswrapper[4922]: E0218 11:39:01.832888 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.332873834 +0000 UTC m=+144.060577914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.228446 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.231016 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.730585541 +0000 UTC m=+144.458289661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.244546 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b6dxx" event={"ID":"48dabf7e-d1d7-48b6-bc70-5cc88cdcf994","Type":"ContainerStarted","Data":"021babde74a7b3b31a99a40a1dd014b4826e7ac93aab6ba43a2790a2d56fbc9a"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.248774 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" event={"ID":"2f7958cf-7c2d-4c29-bea8-5871267d5e16","Type":"ContainerStarted","Data":"c18498deb1197edbc6261c83110dcc9bce5f86b0dcdab3ffc97e6e31308e59cd"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.250683 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" event={"ID":"083b5af3-1602-4add-a778-86b19df106c2","Type":"ContainerStarted","Data":"d59ca7d52e30ccaf9a6eb7b58026ee4e77061cf64e6752c593e2f43c2e1a8036"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.258036 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" event={"ID":"69fd90c9-8767-4d22-b88e-33fafd8026d8","Type":"ContainerStarted","Data":"611ba1ac340163e184a22ea72cc9a93a243aff084edde9d42aa62f8f0db84524"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.271352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" event={"ID":"158f0672-c017-4e45-a564-96de81f21772","Type":"ContainerStarted","Data":"5e927c6e4399c8def57efddaefd76cf053c5e57aa111182c3f59a17801fceec0"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.271623 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.274650 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pj7qx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.274736 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.280246 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" event={"ID":"faaf8fb4-0dba-494d-8a14-2dba7901f50a","Type":"ContainerStarted","Data":"0d69a5b463245362a2382e19ad22752b28cbd883a56d832e09e61167a9957ad5"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.282029 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sddqb" event={"ID":"ca3008b0-2ba6-4dfd-9fea-d1890e2af197","Type":"ContainerStarted","Data":"00ae2fd6f157a70e9a2ab233d003233d31639ccbb06494ef24d137c6b612295d"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.283238 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" event={"ID":"2787200d-e2f9-477b-bb3c-c1c40201f13a","Type":"ContainerStarted","Data":"79767b99e85384af50c8693dba2a682ded6dd1b9ae9a283014045ca0afedee09"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.284355 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" event={"ID":"027da92d-9293-48ea-bd00-47b0fcb186fd","Type":"ContainerStarted","Data":"5e0d376649c250a25133e6a296c4532a0a2731e6566d3d67e8ebb01ac4310734"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.301156 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" event={"ID":"353bd1c5-bab8-42cc-925a-d9776ac60b6b","Type":"ContainerStarted","Data":"b9a00451ff85851fa144629a5b4122f5e54d94ccbd87a0412a51fc58d7870fa3"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.306197 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nfn89" event={"ID":"14e81dbf-6c73-481c-b758-4c15cc0f3258","Type":"ContainerStarted","Data":"af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.311666 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ktkz9" event={"ID":"9f7b66c5-b258-4314-b3a5-e08b958245b6","Type":"ContainerStarted","Data":"b9d1cb8abd2e3a3d807243a687c0d3ea717d5cea3dc89c6854b5d6ddaa23869a"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.313074 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" event={"ID":"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80","Type":"ContainerStarted","Data":"8e5bf2b992167ee3b0a7a2fd76cdb65f2030e45b2bc361fe1228f36de8c8e976"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.317488 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.318457 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" event={"ID":"374ac04a-b37d-42c8-b0ca-e2647c86bc74","Type":"ContainerStarted","Data":"a44e76d8d0f5bfda5fce5af3b3a8e798ee0a3cfbf59f263c7d6b0f45d4b4a0ed"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.320116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l8pr7" event={"ID":"984302b1-545d-474c-a808-8c8f716e580e","Type":"ContainerStarted","Data":"ebae5076964f25acb41eaeb433e4308b9c0b30fbb922d93128a427c221eb874d"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.320191 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nc7b9"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.325221 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.332572 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" event={"ID":"9b494454-5efb-466f-81bd-754f7d6fa0a8","Type":"ContainerStarted","Data":"1524098063fb352e09c217245a846501f4dac58d17b40881f8a7facd19b78853"} Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.335996 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.835974308 +0000 UTC m=+144.563678398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.335995 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.336400 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" event={"ID":"8b3dc29a-edba-48bc-823b-33b792856873","Type":"ContainerStarted","Data":"11070fbc8db6773482100ef0c5f27a27d7f418d89908b236e65ee19d59c54ed6"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.336880 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.337283 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 11:39:02.837268801 +0000 UTC m=+144.564972881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.340034 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.342695 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.343049 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" event={"ID":"f4ea70ef-743e-44ef-804c-2f1321999baa","Type":"ContainerStarted","Data":"800b915b8f6df0c4b29fd6fbcc1d95d19b095de634f9f8ea8b178e5c698c4d21"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.345001 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" event={"ID":"0ed74e60-8c19-47d2-b760-a6f8678f38da","Type":"ContainerStarted","Data":"751d6cef6fd428c1e86b566db08e6c7860cd5d2215e658c684b2946ad52f3bbc"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.345189 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.346311 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" event={"ID":"6ae30939-0d1c-4856-86e0-2b0b4797fa6a","Type":"ContainerStarted","Data":"d040bb05b48528b220a1e1d086a8c99ad9feb7e262ee2d1740e5c6b5b5d2e203"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.347347 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.351060 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" event={"ID":"c3bb2fc9-822c-4f53-98bf-70933744cf7f","Type":"ContainerStarted","Data":"52db1346d3e5c28577ccc9c9f0dc56c1b32dc4dc3a9883af1710dab9823da00c"} Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.351463 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gbpm4"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.353606 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x69t8"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.355164 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.358694 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9hdml"] Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.360723 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wzc5h" podStartSLOduration=124.360705314 podStartE2EDuration="2m4.360705314s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:02.345189601 +0000 UTC m=+144.072893691" 
watchObservedRunningTime="2026-02-18 11:39:02.360705314 +0000 UTC m=+144.088409394" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.360828 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ppzj4"] Feb 18 11:39:02 crc kubenswrapper[4922]: W0218 11:39:02.403044 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa233e7a_8a71_495c_b696_2f3dac9f0ada.slice/crio-0d0074ed8642e690505cb0bcb8a3858df0eaa0a88849d13791f2daa4f4d6c521 WatchSource:0}: Error finding container 0d0074ed8642e690505cb0bcb8a3858df0eaa0a88849d13791f2daa4f4d6c521: Status 404 returned error can't find the container with id 0d0074ed8642e690505cb0bcb8a3858df0eaa0a88849d13791f2daa4f4d6c521 Feb 18 11:39:02 crc kubenswrapper[4922]: W0218 11:39:02.404243 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913f9471_59b8_4494_964d_0db4086d77ab.slice/crio-2bb6698dc82485875bc82a38c85082c1a114762f3bc0714cd24ce971fd284aa0 WatchSource:0}: Error finding container 2bb6698dc82485875bc82a38c85082c1a114762f3bc0714cd24ce971fd284aa0: Status 404 returned error can't find the container with id 2bb6698dc82485875bc82a38c85082c1a114762f3bc0714cd24ce971fd284aa0 Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.406235 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nfn89" podStartSLOduration=124.406212496 podStartE2EDuration="2m4.406212496s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:02.402312387 +0000 UTC m=+144.130016477" watchObservedRunningTime="2026-02-18 11:39:02.406212496 +0000 UTC m=+144.133916576" Feb 18 11:39:02 crc kubenswrapper[4922]: W0218 11:39:02.406503 4922 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17bc269_b566_4738_ac8f_354d91dd9245.slice/crio-4fcfedfe43678c41deb2499360ddc0b5661080187a21aa2887eaac9309b75ab1 WatchSource:0}: Error finding container 4fcfedfe43678c41deb2499360ddc0b5661080187a21aa2887eaac9309b75ab1: Status 404 returned error can't find the container with id 4fcfedfe43678c41deb2499360ddc0b5661080187a21aa2887eaac9309b75ab1 Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.438962 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.440150 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:02.940122894 +0000 UTC m=+144.667826974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: W0218 11:39:02.450242 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817e4164_fa3e_41e5_8638_6a512b9d28bf.slice/crio-c2133f63c30ca179015d02d1491013526e0b12ae60dccf733560ec7c5697dfc0 WatchSource:0}: Error finding container c2133f63c30ca179015d02d1491013526e0b12ae60dccf733560ec7c5697dfc0: Status 404 returned error can't find the container with id c2133f63c30ca179015d02d1491013526e0b12ae60dccf733560ec7c5697dfc0 Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.456969 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-78x9f" podStartSLOduration=124.45693924 podStartE2EDuration="2m4.45693924s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:02.455291808 +0000 UTC m=+144.182995888" watchObservedRunningTime="2026-02-18 11:39:02.45693924 +0000 UTC m=+144.184643310" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.483256 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ktkz9" podStartSLOduration=123.483234586 podStartE2EDuration="2m3.483234586s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:02.481979864 +0000 UTC m=+144.209683944" watchObservedRunningTime="2026-02-18 11:39:02.483234586 +0000 UTC m=+144.210938666" Feb 18 11:39:02 crc kubenswrapper[4922]: W0218 11:39:02.488545 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5566256e_1d22_41b3_8c9b_5765acbf0425.slice/crio-a2d8c374f7618e73ccd087e9db919056d5bd8db50ea9c2df381591c83f13763a WatchSource:0}: Error finding container a2d8c374f7618e73ccd087e9db919056d5bd8db50ea9c2df381591c83f13763a: Status 404 returned error can't find the container with id a2d8c374f7618e73ccd087e9db919056d5bd8db50ea9c2df381591c83f13763a Feb 18 11:39:02 crc kubenswrapper[4922]: W0218 11:39:02.496371 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd523bc23_dd6a_4d1f_b72b_2070ecce0cde.slice/crio-4e29e2617ea27de1fa522ac06b00cc6cbffc2ee061ea2183412c95fbb4f0c8ef WatchSource:0}: Error finding container 4e29e2617ea27de1fa522ac06b00cc6cbffc2ee061ea2183412c95fbb4f0c8ef: Status 404 returned error can't find the container with id 4e29e2617ea27de1fa522ac06b00cc6cbffc2ee061ea2183412c95fbb4f0c8ef Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.528841 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" podStartSLOduration=124.52882666 podStartE2EDuration="2m4.52882666s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:02.498756019 +0000 UTC m=+144.226460119" watchObservedRunningTime="2026-02-18 11:39:02.52882666 +0000 UTC m=+144.256530740" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.542334 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.542755 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.042740412 +0000 UTC m=+144.770444492 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.643419 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.643720 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.143694897 +0000 UTC m=+144.871398977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.643854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.644244 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.144237471 +0000 UTC m=+144.871941551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.718507 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.719071 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.723561 4922 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ks48g container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.723620 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" podUID="8719fb44-5fea-4fd5-a516-5d2ab11c221c" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.744812 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 
11:39:02.744932 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.244913788 +0000 UTC m=+144.972617868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.745167 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.745552 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.245537364 +0000 UTC m=+144.973241444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.846682 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.846876 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.346846598 +0000 UTC m=+145.074550698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.847171 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.847856 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.347832843 +0000 UTC m=+145.075536923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.948070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.948387 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.448327477 +0000 UTC m=+145.176031577 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:02 crc kubenswrapper[4922]: I0218 11:39:02.948464 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:02 crc kubenswrapper[4922]: E0218 11:39:02.948896 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.448872641 +0000 UTC m=+145.176576721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.049806 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.049957 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.549927399 +0000 UTC m=+145.277631489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.050175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.050507 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.550493703 +0000 UTC m=+145.278197783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.116127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ktkz9" Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.118290 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.118361 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.151780 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.151997 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.651965331 +0000 UTC m=+145.379669431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.152088 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.152468 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.652450174 +0000 UTC m=+145.380154254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.253623 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.254076 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.754058135 +0000 UTC m=+145.481762215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.355048 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.355459 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.855441652 +0000 UTC m=+145.583145802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.381552 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" event={"ID":"aa233e7a-8a71-495c-b696-2f3dac9f0ada","Type":"ContainerStarted","Data":"0d0074ed8642e690505cb0bcb8a3858df0eaa0a88849d13791f2daa4f4d6c521"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.393627 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" event={"ID":"4891f319-eff4-4b7f-912e-45da55cb4fc2","Type":"ContainerStarted","Data":"d1e042b27b59a1da20d4078cfed6a004e18aa6f2f57bdd3d14ddd84d243969c1"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.397264 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" event={"ID":"adf4d88c-a19b-49bf-bb62-eef23b55efae","Type":"ContainerStarted","Data":"c1d51ad68ed921dd436eee0cb365468fc743537c75614055fa9ee60ceca22696"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.401711 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" event={"ID":"afb43c7e-87bc-4450-ad81-6a22161fb794","Type":"ContainerStarted","Data":"a3c90b7d6bacd996ab468020ba8edd7eeb8331165cdebe9add803d87729721bb"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.412934 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sddqb" event={"ID":"ca3008b0-2ba6-4dfd-9fea-d1890e2af197","Type":"ContainerStarted","Data":"9ce9c276a8e5c26decc1b7c4d95b0fd62537bd8c363cb3b02b87cc2632007a2d"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.413203 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sddqb"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.414530 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" event={"ID":"027da92d-9293-48ea-bd00-47b0fcb186fd","Type":"ContainerStarted","Data":"65d330d319f9449b58d15169f88ad7a6acb8ec0fc3bf9876afbb115c7b1085b6"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.415595 4922 patch_prober.go:28] interesting pod/console-operator-58897d9998-sddqb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.415649 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sddqb" podUID="ca3008b0-2ba6-4dfd-9fea-d1890e2af197" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.418287 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b6dxx" event={"ID":"48dabf7e-d1d7-48b6-bc70-5cc88cdcf994","Type":"ContainerStarted","Data":"2c993aa4181e3c591333184139c4fdd1e2fe5f505f42b28f05ca8cb6ce671444"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.422485 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-b6dxx"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.426693 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fs2lx" podStartSLOduration=125.426668115 podStartE2EDuration="2m5.426668115s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.422661353 +0000 UTC m=+145.150365433" watchObservedRunningTime="2026-02-18 11:39:03.426668115 +0000 UTC m=+145.154372215"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.435085 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-b6dxx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.435129 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b6dxx" podUID="48dabf7e-d1d7-48b6-bc70-5cc88cdcf994" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.435349 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" event={"ID":"c3bb2fc9-822c-4f53-98bf-70933744cf7f","Type":"ContainerStarted","Data":"6194b5e6d35bc369d7c503f0dab1c1b4699e56c00e185999f0dd2d244bb4e4b6"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.436191 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.442501 4922 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-v6m8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.442556 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.459736 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.460165 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:03.960149492 +0000 UTC m=+145.687853572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.478589 4922 generic.go:334] "Generic (PLEG): container finished" podID="2f7958cf-7c2d-4c29-bea8-5871267d5e16" containerID="b07cf4437cbc267ee1a52bb7027f8b8ea0689cc07e6154dc429ff4f87cfa0e5a" exitCode=0
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.478654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" event={"ID":"2f7958cf-7c2d-4c29-bea8-5871267d5e16","Type":"ContainerDied","Data":"b07cf4437cbc267ee1a52bb7027f8b8ea0689cc07e6154dc429ff4f87cfa0e5a"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.479720 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sddqb" podStartSLOduration=125.479702637 podStartE2EDuration="2m5.479702637s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.478010384 +0000 UTC m=+145.205714464" watchObservedRunningTime="2026-02-18 11:39:03.479702637 +0000 UTC m=+145.207406717"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.480897 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-b6dxx" podStartSLOduration=125.480890437 podStartE2EDuration="2m5.480890437s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.459899946 +0000 UTC m=+145.187604026" watchObservedRunningTime="2026-02-18 11:39:03.480890437 +0000 UTC m=+145.208594517"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.483908 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l8pr7" event={"ID":"984302b1-545d-474c-a808-8c8f716e580e","Type":"ContainerStarted","Data":"5b4fc0ef007d316fad1dbc341ba002a0863284420b1ba189011397c38969afc2"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.492062 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" event={"ID":"642557ec-2e08-451b-8a4c-b4e8cf88f048","Type":"ContainerStarted","Data":"20227f686332cad34e3df8e5a3e855e1a786f126dd7a8186dccb49aa54c66573"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.496828 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" podStartSLOduration=124.49681159 podStartE2EDuration="2m4.49681159s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.493529477 +0000 UTC m=+145.221233557" watchObservedRunningTime="2026-02-18 11:39:03.49681159 +0000 UTC m=+145.224515670"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.498887 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" event={"ID":"8890a31d-16c4-4c0a-a1f8-4ce9d3314a80","Type":"ContainerStarted","Data":"aedf250be9431181205f229da7e261f45b7d32d0337001be99513451534925aa"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.500634 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" event={"ID":"5566256e-1d22-41b3-8c9b-5765acbf0425","Type":"ContainerStarted","Data":"a2d8c374f7618e73ccd087e9db919056d5bd8db50ea9c2df381591c83f13763a"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.502037 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" event={"ID":"8b3dc29a-edba-48bc-823b-33b792856873","Type":"ContainerStarted","Data":"8af51f2408c0810de69fc4521d83e2110bcc108ad7f3e68b264c01bcc3ce5b25"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.504064 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" event={"ID":"75c707c4-5c62-438f-8312-2307d3ef0ba8","Type":"ContainerStarted","Data":"28c2cf88217349a49db593635ea8e128208ef1ae24d7cc6d1020cc30632765bf"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.506290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" event={"ID":"a768634b-1586-4ba2-9a05-6a88f5befea1","Type":"ContainerStarted","Data":"24875c03230fe7b07f2bf2b1aeacb99091fe584af6a63a16379985eff3253dce"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.512866 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-l8pr7" podStartSLOduration=6.512843586 podStartE2EDuration="6.512843586s" podCreationTimestamp="2026-02-18 11:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.509671985 +0000 UTC m=+145.237376065" watchObservedRunningTime="2026-02-18 11:39:03.512843586 +0000 UTC m=+145.240547666"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.514091 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" event={"ID":"817e4164-fa3e-41e5-8638-6a512b9d28bf","Type":"ContainerStarted","Data":"c2133f63c30ca179015d02d1491013526e0b12ae60dccf733560ec7c5697dfc0"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.520001 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4ea70ef-743e-44ef-804c-2f1321999baa" containerID="800b915b8f6df0c4b29fd6fbcc1d95d19b095de634f9f8ea8b178e5c698c4d21" exitCode=0
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.520088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" event={"ID":"f4ea70ef-743e-44ef-804c-2f1321999baa","Type":"ContainerDied","Data":"800b915b8f6df0c4b29fd6fbcc1d95d19b095de634f9f8ea8b178e5c698c4d21"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.535632 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gbpm4" event={"ID":"d523bc23-dd6a-4d1f-b72b-2070ecce0cde","Type":"ContainerStarted","Data":"4e29e2617ea27de1fa522ac06b00cc6cbffc2ee061ea2183412c95fbb4f0c8ef"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.549063 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" event={"ID":"913f9471-59b8-4494-964d-0db4086d77ab","Type":"ContainerStarted","Data":"2bb6698dc82485875bc82a38c85082c1a114762f3bc0714cd24ce971fd284aa0"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.572025 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.574487 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.074467056 +0000 UTC m=+145.802171136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.579294 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" event={"ID":"a62974f3-68b4-451d-9887-bf8af554ace0","Type":"ContainerStarted","Data":"9f1463891f17f48341e02c80d813b5f3c7115df8d26a1a8be676f77303b3e781"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.601145 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nbzck" podStartSLOduration=125.60112884 podStartE2EDuration="2m5.60112884s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.5651588 +0000 UTC m=+145.292862880" watchObservedRunningTime="2026-02-18 11:39:03.60112884 +0000 UTC m=+145.328832920"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.628152 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" podStartSLOduration=124.628136144 podStartE2EDuration="2m4.628136144s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.602079634 +0000 UTC m=+145.329783734" watchObservedRunningTime="2026-02-18 11:39:03.628136144 +0000 UTC m=+145.355840224"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.634403 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" event={"ID":"9e36551d-13cd-4a75-a29b-658850b46cb8","Type":"ContainerStarted","Data":"6267ab86fea6d91bef9e1d2fb055261c2673103810304b496229122d5cfe0a9c"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.635566 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.648899 4922 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-q7mwg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.648939 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.653161 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xvfbm" podStartSLOduration=125.653146237 podStartE2EDuration="2m5.653146237s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.630598896 +0000 UTC m=+145.358302976" watchObservedRunningTime="2026-02-18 11:39:03.653146237 +0000 UTC m=+145.380850317"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.672069 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" podStartSLOduration=125.672054256 podStartE2EDuration="2m5.672054256s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.671653686 +0000 UTC m=+145.399357766" watchObservedRunningTime="2026-02-18 11:39:03.672054256 +0000 UTC m=+145.399758336"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.678605 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" event={"ID":"d17bc269-b566-4738-ac8f-354d91dd9245","Type":"ContainerStarted","Data":"4fcfedfe43678c41deb2499360ddc0b5661080187a21aa2887eaac9309b75ab1"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.684317 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9hdml" event={"ID":"66dfc07c-fa4c-48ad-9904-2a767310c6ac","Type":"ContainerStarted","Data":"8b504b001cdb830339dab531348794aef1e7dff43d9a2dc9460db743d8768619"}
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.685912 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.686439 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.186415479 +0000 UTC m=+145.914119559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.686709 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pj7qx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.686744 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.687007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.687345 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.187334772 +0000 UTC m=+145.915038862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.714268 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" podStartSLOduration=125.714248924 podStartE2EDuration="2m5.714248924s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.70621746 +0000 UTC m=+145.433921540" watchObservedRunningTime="2026-02-18 11:39:03.714248924 +0000 UTC m=+145.441953014"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.742635 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9hdml" podStartSLOduration=6.742605531 podStartE2EDuration="6.742605531s" podCreationTimestamp="2026-02-18 11:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:03.736785104 +0000 UTC m=+145.464489184" watchObservedRunningTime="2026-02-18 11:39:03.742605531 +0000 UTC m=+145.470309621"
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.788712 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.788917 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.288891673 +0000 UTC m=+146.016595753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.794691 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.794940 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.294928156 +0000 UTC m=+146.022632336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.896296 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.896893 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.396869076 +0000 UTC m=+146.124573156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.897039 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.897396 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.397358148 +0000 UTC m=+146.125062228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.998677 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.998869 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.498828077 +0000 UTC m=+146.226532167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:03 crc kubenswrapper[4922]: I0218 11:39:03.999245 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:03 crc kubenswrapper[4922]: E0218 11:39:03.999601 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.499587736 +0000 UTC m=+146.227291806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.100706 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.101156 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.601133476 +0000 UTC m=+146.328837556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.101232 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.101627 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.601598718 +0000 UTC m=+146.329302798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.116854 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.116926 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.201914 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.202163 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.702139653 +0000 UTC m=+146.429843733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.202807 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.203188 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.703177209 +0000 UTC m=+146.430881279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.303766 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.304048 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.804034252 +0000 UTC m=+146.531738332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.405752 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.406237 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:04.906221699 +0000 UTC m=+146.633925789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.506751 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.507068 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.007052361 +0000 UTC m=+146.734756431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.608354 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.608738 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.108727525 +0000 UTC m=+146.836431605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.694016 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" event={"ID":"a62974f3-68b4-451d-9887-bf8af554ace0","Type":"ContainerStarted","Data":"cb12ef1479e4563d313fa40b0a093d64397251fa35ae5fb84e06aafcade603d0"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.698645 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" event={"ID":"083b5af3-1602-4add-a778-86b19df106c2","Type":"ContainerStarted","Data":"65c4aea109955d848178629401d9f210c9f469ab62fc5a28d6987fbd54481f48"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.700712 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" event={"ID":"642557ec-2e08-451b-8a4c-b4e8cf88f048","Type":"ContainerStarted","Data":"786d71aa506311a7b9db900698bd3f7301429e811f1275733fe62ca5b6103707"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.700761 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" event={"ID":"642557ec-2e08-451b-8a4c-b4e8cf88f048","Type":"ContainerStarted","Data":"107db514ae864b4e2b106c6e0c6c97719e7ce4f85f0e80f5dad24f2f718e6206"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.702700 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" event={"ID":"5566256e-1d22-41b3-8c9b-5765acbf0425","Type":"ContainerStarted","Data":"8d388362a3e88a1d99161f2e4f8dbf961fa358c8bd007e5b64d3db3c3a1da0d6"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.702860 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.705571 4922 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-p4qtq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.705639 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" podUID="5566256e-1d22-41b3-8c9b-5765acbf0425" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.707556 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" event={"ID":"0ed74e60-8c19-47d2-b760-a6f8678f38da","Type":"ContainerStarted","Data":"3cb9a47c5c2c48cf77b858b29b12e1469109afb2aca6b57d27222f7ff280ea0f"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.708813 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:04 crc 
kubenswrapper[4922]: I0218 11:39:04.708928 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" event={"ID":"75c707c4-5c62-438f-8312-2307d3ef0ba8","Type":"ContainerStarted","Data":"28b5d335d68c0542326e75d8138276312eb3264af281f35e13ca715cee393fd1"} Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.708985 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.208964142 +0000 UTC m=+146.936668222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.709206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.709507 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.209488375 +0000 UTC m=+146.937192455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.710334 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" event={"ID":"2787200d-e2f9-477b-bb3c-c1c40201f13a","Type":"ContainerStarted","Data":"c51ccf08396fb2bb9a58735128459ae1a04af5bddea74108676e1ea98d280471"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.712356 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" event={"ID":"f4ea70ef-743e-44ef-804c-2f1321999baa","Type":"ContainerStarted","Data":"856a19ea3f6bb5c476999f4118a140582f9a1373a71996bd3be3685b50fae44b"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.712435 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.715307 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9hdml" event={"ID":"66dfc07c-fa4c-48ad-9904-2a767310c6ac","Type":"ContainerStarted","Data":"d1ca058a4ce2873007f1c7261ff2f0f31984ceb9aba5a5bba09875c116ebba39"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.717517 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" 
event={"ID":"913f9471-59b8-4494-964d-0db4086d77ab","Type":"ContainerStarted","Data":"d10367914cd300d01c9b56f2bbd95d902130de10c973ab9d29618714cacdeb85"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.717702 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.720716 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ppzj4" podStartSLOduration=125.720695329 podStartE2EDuration="2m5.720695329s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.718602546 +0000 UTC m=+146.446306626" watchObservedRunningTime="2026-02-18 11:39:04.720695329 +0000 UTC m=+146.448399409" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.721256 4922 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vgs8b container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.721301 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" podUID="913f9471-59b8-4494-964d-0db4086d77ab" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.722132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-prk5g" 
event={"ID":"6ae30939-0d1c-4856-86e0-2b0b4797fa6a","Type":"ContainerStarted","Data":"3517a33c4999f050839ab57c18af1c001b2c34efea6e1a5b34392e4587e073de"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.724820 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gbpm4" event={"ID":"d523bc23-dd6a-4d1f-b72b-2070ecce0cde","Type":"ContainerStarted","Data":"7431926b174e82de716da14c7c22b82e49895f227151ee5e2f7edef9d6ab1ffb"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.724857 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gbpm4" event={"ID":"d523bc23-dd6a-4d1f-b72b-2070ecce0cde","Type":"ContainerStarted","Data":"0cb3d1345b998746e1242f7bfb1c40974d53b3289c499e13bddda8a47a7ae977"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.724952 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gbpm4" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.726246 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" event={"ID":"aa233e7a-8a71-495c-b696-2f3dac9f0ada","Type":"ContainerStarted","Data":"47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.726482 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.728017 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nc7b9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.728078 4922 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.728315 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" event={"ID":"817e4164-fa3e-41e5-8638-6a512b9d28bf","Type":"ContainerStarted","Data":"129c2315b1a47a4586de5e9f9f7716909cf67b9add10228f2cffcf3c179dd78c"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.731021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-js5gq" event={"ID":"69fd90c9-8767-4d22-b88e-33fafd8026d8","Type":"ContainerStarted","Data":"bf73eb69e2451fbf2587410449a76eab593426f6282bcf400200cd18ed5415d9"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.733202 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" event={"ID":"faaf8fb4-0dba-494d-8a14-2dba7901f50a","Type":"ContainerStarted","Data":"720b2484c75640f7273395f1c0aceb8c1aaf51e7b3b5916211b28cfe7c6147e9"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.735421 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" event={"ID":"374ac04a-b37d-42c8-b0ca-e2647c86bc74","Type":"ContainerStarted","Data":"463713d8f98bc911518d2ca27eeab62e0684e10e9b167f050739a8e084663469"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.735465 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" 
event={"ID":"374ac04a-b37d-42c8-b0ca-e2647c86bc74","Type":"ContainerStarted","Data":"a4a54a9edd61103198db5dfd95b5ce19026c37e7687af7267aabb69845ac249a"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.736768 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" event={"ID":"4891f319-eff4-4b7f-912e-45da55cb4fc2","Type":"ContainerStarted","Data":"9404f5e2ae4aabd3d9991c6817b693cd457683a6951ce280ea0524cf0d419898"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.736979 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.739193 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" event={"ID":"d17bc269-b566-4738-ac8f-354d91dd9245","Type":"ContainerStarted","Data":"fe3960cce94d066ad32220e4455ff52bcf307ee9a5bcb70c93c7628930d14353"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.739263 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" event={"ID":"d17bc269-b566-4738-ac8f-354d91dd9245","Type":"ContainerStarted","Data":"fd07afdb87bdbec61651ca48b9f569e1f81df8b8a6b58534d3247942100d567d"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.739576 4922 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zm5zz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.739597 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:04 crc 
kubenswrapper[4922]: I0218 11:39:04.739615 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" podUID="4891f319-eff4-4b7f-912e-45da55cb4fc2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.740835 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" event={"ID":"adf4d88c-a19b-49bf-bb62-eef23b55efae","Type":"ContainerStarted","Data":"2d581da97420156437790faed18ade975933011c52eec00dca326338151b817d"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.742869 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" event={"ID":"353bd1c5-bab8-42cc-925a-d9776ac60b6b","Type":"ContainerStarted","Data":"6f1e7be260dd556cd0986477cadcd0f657832041dd39aa3f2efa9d0308fdf348"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.745813 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" event={"ID":"9b494454-5efb-466f-81bd-754f7d6fa0a8","Type":"ContainerStarted","Data":"f1168c8c54acf15fffb4b9d435d37323d0c717516e4f3a062f239f98765b2216"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.745856 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" event={"ID":"9b494454-5efb-466f-81bd-754f7d6fa0a8","Type":"ContainerStarted","Data":"a716f7cc11ffb59f882c129b969ab905c12ed0dafdf8d34ce18a47de24d232c5"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.749443 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" 
event={"ID":"2f7958cf-7c2d-4c29-bea8-5871267d5e16","Type":"ContainerStarted","Data":"c84ab4747087531677041b9910125bba0d9c74d28809f960eb9f4f401b7180fe"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.751461 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" event={"ID":"027da92d-9293-48ea-bd00-47b0fcb186fd","Type":"ContainerStarted","Data":"cf5ac84c4abc58bebbc7b8c441d8ff470b1e0e95b354d2cfa1151169bc01b6e2"} Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.752218 4922 patch_prober.go:28] interesting pod/console-operator-58897d9998-sddqb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.752330 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sddqb" podUID="ca3008b0-2ba6-4dfd-9fea-d1890e2af197" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.752489 4922 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-q7mwg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.752545 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 18 
11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.752709 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-b6dxx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.752754 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b6dxx" podUID="48dabf7e-d1d7-48b6-bc70-5cc88cdcf994" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.756303 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gst7r" podStartSLOduration=125.756273479 podStartE2EDuration="2m5.756273479s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.75591291 +0000 UTC m=+146.483617000" watchObservedRunningTime="2026-02-18 11:39:04.756273479 +0000 UTC m=+146.483977569" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.810583 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.810763 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:39:05.310722187 +0000 UTC m=+147.038426267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.811498 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.813097 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.313079807 +0000 UTC m=+147.040783997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.849078 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" podStartSLOduration=125.849046887 podStartE2EDuration="2m5.849046887s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.794596499 +0000 UTC m=+146.522300579" watchObservedRunningTime="2026-02-18 11:39:04.849046887 +0000 UTC m=+146.576750957" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.909201 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rwxzh" podStartSLOduration=125.90917968 podStartE2EDuration="2m5.90917968s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.864907289 +0000 UTC m=+146.592611379" watchObservedRunningTime="2026-02-18 11:39:04.90917968 +0000 UTC m=+146.636883760" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.910617 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" podStartSLOduration=126.910608816 podStartE2EDuration="2m6.910608816s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.909653422 +0000 UTC m=+146.637357502" watchObservedRunningTime="2026-02-18 11:39:04.910608816 +0000 UTC m=+146.638312896" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.913849 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:04 crc kubenswrapper[4922]: E0218 11:39:04.915097 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.415073379 +0000 UTC m=+147.142777529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.986818 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r84r2" podStartSLOduration=125.986804544 podStartE2EDuration="2m5.986804544s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.986380794 +0000 UTC m=+146.714084874" watchObservedRunningTime="2026-02-18 11:39:04.986804544 +0000 UTC m=+146.714508624" Feb 18 11:39:04 crc kubenswrapper[4922]: I0218 11:39:04.987614 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8l7q" podStartSLOduration=125.987609325 podStartE2EDuration="2m5.987609325s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:04.948016923 +0000 UTC m=+146.675721003" watchObservedRunningTime="2026-02-18 11:39:04.987609325 +0000 UTC m=+146.715313405" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.016112 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.016698 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.516684521 +0000 UTC m=+147.244388601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.022285 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" podStartSLOduration=126.022264562 podStartE2EDuration="2m6.022264562s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.020652991 +0000 UTC m=+146.748357071" watchObservedRunningTime="2026-02-18 11:39:05.022264562 +0000 UTC m=+146.749968642" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.058522 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" podStartSLOduration=127.058508229 podStartE2EDuration="2m7.058508229s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-18 11:39:05.057824592 +0000 UTC m=+146.785528672" watchObservedRunningTime="2026-02-18 11:39:05.058508229 +0000 UTC m=+146.786212299" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.072830 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.073453 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.075534 4922 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-vhdd8 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.32:8443/livez\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.075594 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" podUID="2f7958cf-7c2d-4c29-bea8-5871267d5e16" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.32:8443/livez\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.100786 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" podStartSLOduration=126.100767289 podStartE2EDuration="2m6.100767289s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.098759258 +0000 UTC m=+146.826463338" watchObservedRunningTime="2026-02-18 11:39:05.100767289 +0000 UTC m=+146.828471369" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.117433 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.117838 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.617822981 +0000 UTC m=+147.345527061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.126046 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:39:05 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Feb 18 11:39:05 crc kubenswrapper[4922]: [+]process-running ok Feb 18 11:39:05 crc kubenswrapper[4922]: healthz check failed Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.126106 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:39:05 crc 
kubenswrapper[4922]: I0218 11:39:05.212838 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" podStartSLOduration=126.212822675 podStartE2EDuration="2m6.212822675s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.208857425 +0000 UTC m=+146.936561505" watchObservedRunningTime="2026-02-18 11:39:05.212822675 +0000 UTC m=+146.940526755" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.213601 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wm5bk" podStartSLOduration=126.213593775 podStartE2EDuration="2m6.213593775s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.145930912 +0000 UTC m=+146.873634992" watchObservedRunningTime="2026-02-18 11:39:05.213593775 +0000 UTC m=+146.941297855" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.221144 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.221437 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.721426193 +0000 UTC m=+147.449130273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.322903 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.323247 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.82323399 +0000 UTC m=+147.550938070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.359542 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" podStartSLOduration=126.359522078 podStartE2EDuration="2m6.359522078s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.316593322 +0000 UTC m=+147.044297392" watchObservedRunningTime="2026-02-18 11:39:05.359522078 +0000 UTC m=+147.087226158" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.397222 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8" podStartSLOduration=126.397200182 podStartE2EDuration="2m6.397200182s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.361618521 +0000 UTC m=+147.089322601" watchObservedRunningTime="2026-02-18 11:39:05.397200182 +0000 UTC m=+147.124904262" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.398262 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wdprh" podStartSLOduration=127.398254509 podStartE2EDuration="2m7.398254509s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.397689775 +0000 UTC m=+147.125393855" watchObservedRunningTime="2026-02-18 11:39:05.398254509 +0000 UTC m=+147.125958589" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.430200 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.431179 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:05.931167462 +0000 UTC m=+147.658871542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.465615 4922 csr.go:261] certificate signing request csr-6t8gk is approved, waiting to be issued Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.487326 4922 csr.go:257] certificate signing request csr-6t8gk is issued Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.504235 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9kz6f" podStartSLOduration=126.504216941 podStartE2EDuration="2m6.504216941s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.441704039 +0000 UTC m=+147.169408129" watchObservedRunningTime="2026-02-18 11:39:05.504216941 +0000 UTC m=+147.231921021" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.532159 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.532655 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.03263579 +0000 UTC m=+147.760339870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.550937 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gbpm4" podStartSLOduration=8.550922563 podStartE2EDuration="8.550922563s" podCreationTimestamp="2026-02-18 11:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.546345547 +0000 UTC m=+147.274049627" watchObservedRunningTime="2026-02-18 11:39:05.550922563 +0000 UTC m=+147.278626643" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.551230 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jkv6t" podStartSLOduration=126.551225971 podStartE2EDuration="2m6.551225971s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.506197471 +0000 UTC m=+147.233901551" watchObservedRunningTime="2026-02-18 11:39:05.551225971 +0000 UTC m=+147.278930051" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.575669 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9b55m" podStartSLOduration=127.575647349 
podStartE2EDuration="2m7.575647349s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.568722214 +0000 UTC m=+147.296426294" watchObservedRunningTime="2026-02-18 11:39:05.575647349 +0000 UTC m=+147.303351449" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.598343 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v4cwm" podStartSLOduration=126.598329783 podStartE2EDuration="2m6.598329783s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.596765553 +0000 UTC m=+147.324469633" watchObservedRunningTime="2026-02-18 11:39:05.598329783 +0000 UTC m=+147.326033853" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.621202 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6msr" podStartSLOduration=126.621185372 podStartE2EDuration="2m6.621185372s" podCreationTimestamp="2026-02-18 11:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:05.620020322 +0000 UTC m=+147.347724402" watchObservedRunningTime="2026-02-18 11:39:05.621185372 +0000 UTC m=+147.348889452" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.635248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.635703 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.135688769 +0000 UTC m=+147.863392849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.651710 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.739922 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.740470 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.24045161 +0000 UTC m=+147.968155690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.795355 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" event={"ID":"a768634b-1586-4ba2-9a05-6a88f5befea1","Type":"ContainerStarted","Data":"72bb5869f6a6496f8b63fd6ed562f4bb406d18a6d8ad2d2971ddbfd870acaf55"} Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.796717 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nc7b9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.796763 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.819701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-p4qtq" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.848240 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.848363 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm5zz" Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.848659 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.348643399 +0000 UTC m=+148.076347479 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.949091 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:05 crc kubenswrapper[4922]: I0218 11:39:05.969796 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:39:05 crc kubenswrapper[4922]: E0218 11:39:05.970438 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.470400411 +0000 UTC m=+148.198104491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.077440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.077913 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.577898392 +0000 UTC m=+148.305602472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.122793 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:39:06 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Feb 18 11:39:06 crc kubenswrapper[4922]: [+]process-running ok Feb 18 11:39:06 crc kubenswrapper[4922]: healthz check failed Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.122857 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.178744 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.178966 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:39:06.678925939 +0000 UTC m=+148.406630019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.179215 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.179696 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.679674048 +0000 UTC m=+148.407378128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.280541 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.280964 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.780897489 +0000 UTC m=+148.508601569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.281099 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.281534 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.781518365 +0000 UTC m=+148.509222445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.382059 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.382286 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.882245524 +0000 UTC m=+148.609949604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.382754 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.383074 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.883059705 +0000 UTC m=+148.610763785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.484270 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.484703 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.984668767 +0000 UTC m=+148.712372857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.485584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.486091 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:06.986078892 +0000 UTC m=+148.713782972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.488210 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-18 11:34:05 +0000 UTC, rotation deadline is 2026-11-01 11:09:18.354946646 +0000 UTC Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.488253 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6143h30m11.866696429s for next certificate rotation Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.586923 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.587231 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.087216062 +0000 UTC m=+148.814920142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.587588 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.587889 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.087881399 +0000 UTC m=+148.815585479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.666942 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgs8b" Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.688852 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.689109 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.189092491 +0000 UTC m=+148.916796571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.689854 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.18984601 +0000 UTC m=+148.917550090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.689965 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.791833 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.792029 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.292000316 +0000 UTC m=+149.019704396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.792873 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.793348 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.29333363 +0000 UTC m=+149.021037710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.802186 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" event={"ID":"a768634b-1586-4ba2-9a05-6a88f5befea1","Type":"ContainerStarted","Data":"cdd3dc5fb1f6da55feebfd1371f0fa68eee2082cb297235c6c7633a91ebfc381"} Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.893930 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.894295 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.394279385 +0000 UTC m=+149.121983465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.894635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.894931 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.394923991 +0000 UTC m=+149.122628071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.920347 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5lflw"] Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.921238 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.923350 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.941619 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5lflw"] Feb 18 11:39:06 crc kubenswrapper[4922]: I0218 11:39:06.996185 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:06 crc kubenswrapper[4922]: E0218 11:39:06.997455 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.497432476 +0000 UTC m=+149.225136626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.068967 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7dzbt"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.069977 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.072067 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.089284 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dzbt"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.098004 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-catalog-content\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.098076 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5ts\" (UniqueName: \"kubernetes.io/projected/9cddee0a-8b13-429b-89b6-e820f8f3ec59-kube-api-access-6x5ts\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " 
pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.098278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.098325 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-utilities\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.098645 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.598633807 +0000 UTC m=+149.326337877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.119367 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:39:07 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Feb 18 11:39:07 crc kubenswrapper[4922]: [+]process-running ok Feb 18 11:39:07 crc kubenswrapper[4922]: healthz check failed Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.119424 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.202941 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.203193 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-catalog-content\") pod \"certified-operators-7dzbt\" (UID: 
\"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.203258 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-utilities\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.203326 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-catalog-content\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.203355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-utilities\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.203428 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6nbf\" (UniqueName: \"kubernetes.io/projected/fe4edbcb-8a38-4f30-975f-aa4825192b4e-kube-api-access-k6nbf\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.203462 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5ts\" (UniqueName: \"kubernetes.io/projected/9cddee0a-8b13-429b-89b6-e820f8f3ec59-kube-api-access-6x5ts\") pod 
\"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.203841 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.70382776 +0000 UTC m=+149.431531840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.204551 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-utilities\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.204769 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-catalog-content\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.234244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5ts\" (UniqueName: 
\"kubernetes.io/projected/9cddee0a-8b13-429b-89b6-e820f8f3ec59-kube-api-access-6x5ts\") pod \"community-operators-5lflw\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.251676 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.283380 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lcnjk"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.284727 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.302827 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcnjk"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.308494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-utilities\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.308562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6nbf\" (UniqueName: \"kubernetes.io/projected/fe4edbcb-8a38-4f30-975f-aa4825192b4e-kube-api-access-k6nbf\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.308614 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-catalog-content\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.308644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.308957 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:07.808945631 +0000 UTC m=+149.536649711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.309413 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-utilities\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.309904 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-catalog-content\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.363123 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6nbf\" (UniqueName: \"kubernetes.io/projected/fe4edbcb-8a38-4f30-975f-aa4825192b4e-kube-api-access-k6nbf\") pod \"certified-operators-7dzbt\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.383653 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.409832 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.410041 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvvjz\" (UniqueName: \"kubernetes.io/projected/fc9b41f8-ac9b-4166-a2a6-80326e19254a-kube-api-access-fvvjz\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.410072 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-utilities\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.410130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-catalog-content\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.410256 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-18 11:39:07.910242535 +0000 UTC m=+149.637946615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.514263 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvvjz\" (UniqueName: \"kubernetes.io/projected/fc9b41f8-ac9b-4166-a2a6-80326e19254a-kube-api-access-fvvjz\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.514559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-utilities\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.514621 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-catalog-content\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.514649 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.514951 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.014939855 +0000 UTC m=+149.742643935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.515660 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-utilities\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.515860 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-catalog-content\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.520558 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-nrxb6"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.524670 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.569108 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrxb6"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.573417 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvvjz\" (UniqueName: \"kubernetes.io/projected/fc9b41f8-ac9b-4166-a2a6-80326e19254a-kube-api-access-fvvjz\") pod \"community-operators-lcnjk\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.618478 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.618795 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.118781843 +0000 UTC m=+149.846485923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.627650 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.720214 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tq9z\" (UniqueName: \"kubernetes.io/projected/48ccc67a-6393-4dea-9c00-24bbc55e34d3-kube-api-access-9tq9z\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.720274 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-catalog-content\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.720332 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-utilities\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.720477 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.720906 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.220890858 +0000 UTC m=+149.948594948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.740527 4922 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ks48g container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]log ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]etcd ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/generic-apiserver-start-informers ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/max-in-flight-filter ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 18 11:39:07 crc kubenswrapper[4922]: 
[+]poststarthook/image.openshift.io-apiserver-caches ok Feb 18 11:39:07 crc kubenswrapper[4922]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/project.openshift.io-projectcache ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/openshift.io-startinformers ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 18 11:39:07 crc kubenswrapper[4922]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 18 11:39:07 crc kubenswrapper[4922]: livez check failed Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.740585 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ks48g" podUID="8719fb44-5fea-4fd5-a516-5d2ab11c221c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.824027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.824462 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tq9z\" (UniqueName: \"kubernetes.io/projected/48ccc67a-6393-4dea-9c00-24bbc55e34d3-kube-api-access-9tq9z\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc 
kubenswrapper[4922]: I0218 11:39:07.824537 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-catalog-content\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.824617 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-utilities\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.825861 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-utilities\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.825979 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.325955957 +0000 UTC m=+150.053660077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.827940 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-catalog-content\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.868466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tq9z\" (UniqueName: \"kubernetes.io/projected/48ccc67a-6393-4dea-9c00-24bbc55e34d3-kube-api-access-9tq9z\") pod \"certified-operators-nrxb6\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.869970 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5lflw"] Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.874458 4922 generic.go:334] "Generic (PLEG): container finished" podID="75c707c4-5c62-438f-8312-2307d3ef0ba8" containerID="28b5d335d68c0542326e75d8138276312eb3264af281f35e13ca715cee393fd1" exitCode=0 Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.874587 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" 
event={"ID":"75c707c4-5c62-438f-8312-2307d3ef0ba8","Type":"ContainerDied","Data":"28b5d335d68c0542326e75d8138276312eb3264af281f35e13ca715cee393fd1"} Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.885873 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" event={"ID":"a768634b-1586-4ba2-9a05-6a88f5befea1","Type":"ContainerStarted","Data":"03e1a416ade5db9ec99c4b0589181f7a5bef5c1cac2f5083f1dbdcfa6f069bdb"} Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.885917 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" event={"ID":"a768634b-1586-4ba2-9a05-6a88f5befea1","Type":"ContainerStarted","Data":"7f104f486c331e04994cef74c68022853ba96007cd1988029409f1ba6aeaa07c"} Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.920189 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.930656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:07 crc kubenswrapper[4922]: E0218 11:39:07.931669 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.431657023 +0000 UTC m=+150.159361103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:07 crc kubenswrapper[4922]: I0218 11:39:07.945233 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" podStartSLOduration=10.945216676 podStartE2EDuration="10.945216676s" podCreationTimestamp="2026-02-18 11:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:07.938966908 +0000 UTC m=+149.666670988" watchObservedRunningTime="2026-02-18 11:39:07.945216676 +0000 UTC m=+149.672920756" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.033110 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.033415 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.033454 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.033501 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:39:08 crc kubenswrapper[4922]: E0218 11:39:08.033617 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.533574272 +0000 UTC m=+150.261278352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.033825 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.033870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:08 crc kubenswrapper[4922]: E0218 11:39:08.034379 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.534354442 +0000 UTC m=+150.262058522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.039066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.048022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.053876 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.059255 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.065866 4922 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.085874 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcnjk"] Feb 18 11:39:08 crc kubenswrapper[4922]: W0218 11:39:08.105064 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9b41f8_ac9b_4166_a2a6_80326e19254a.slice/crio-a8cd6fc2cc1529740a5644abc0364ded3c6ac51bb6647c92d79339c69415a299 WatchSource:0}: Error finding container a8cd6fc2cc1529740a5644abc0364ded3c6ac51bb6647c92d79339c69415a299: Status 404 returned error can't find the container with id a8cd6fc2cc1529740a5644abc0364ded3c6ac51bb6647c92d79339c69415a299 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.125791 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:39:08 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Feb 18 11:39:08 crc kubenswrapper[4922]: [+]process-running ok Feb 18 11:39:08 crc kubenswrapper[4922]: healthz check failed Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.125882 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.140383 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:08 crc kubenswrapper[4922]: E0218 11:39:08.140564 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:39:08.6405375 +0000 UTC m=+150.368241580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.140643 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:08 crc kubenswrapper[4922]: E0218 11:39:08.140924 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 11:39:08.640913339 +0000 UTC m=+150.368617419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wt2rf" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.153265 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dzbt"] Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.230487 4922 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-18T11:39:08.06589768Z","Handler":null,"Name":""} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.236501 4922 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.236533 4922 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.241791 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.265755 4922 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.298637 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.307153 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.312278 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.345145 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.355272 4922 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.355498 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.404665 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrxb6"] Feb 18 11:39:08 crc kubenswrapper[4922]: W0218 11:39:08.492741 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ccc67a_6393_4dea_9c00_24bbc55e34d3.slice/crio-cf75383b2030608656ce7bf5f480ccca8d0511a948c0bdf822601c66ee3563e2 WatchSource:0}: Error finding container cf75383b2030608656ce7bf5f480ccca8d0511a948c0bdf822601c66ee3563e2: Status 404 returned error can't find the container with id cf75383b2030608656ce7bf5f480ccca8d0511a948c0bdf822601c66ee3563e2 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.521152 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wt2rf\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.700579 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 11:39:08 crc kubenswrapper[4922]: 
I0218 11:39:08.701439 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.707626 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.707883 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.721635 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.753149 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7twg2" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.763180 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.852691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f5b001-7d04-46f6-b77d-f79f28d8513e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.852787 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45f5b001-7d04-46f6-b77d-f79f28d8513e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.869802 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5vjsn"] Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.873158 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.882948 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.905692 4922 generic.go:334] "Generic (PLEG): container finished" podID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerID="120adbdad8789c27eefe3c782cdf2eec2b4857607b20057ec0fcf6bbe6831fd0" exitCode=0 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.905758 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcnjk" event={"ID":"fc9b41f8-ac9b-4166-a2a6-80326e19254a","Type":"ContainerDied","Data":"120adbdad8789c27eefe3c782cdf2eec2b4857607b20057ec0fcf6bbe6831fd0"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.905791 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcnjk" event={"ID":"fc9b41f8-ac9b-4166-a2a6-80326e19254a","Type":"ContainerStarted","Data":"a8cd6fc2cc1529740a5644abc0364ded3c6ac51bb6647c92d79339c69415a299"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.906048 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vjsn"] Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.913254 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.923812 4922 generic.go:334] "Generic (PLEG): container finished" podID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerID="26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16" exitCode=0 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.923916 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrxb6" 
event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerDied","Data":"26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.923952 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrxb6" event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerStarted","Data":"cf75383b2030608656ce7bf5f480ccca8d0511a948c0bdf822601c66ee3563e2"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.938348 4922 generic.go:334] "Generic (PLEG): container finished" podID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerID="51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605" exitCode=0 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.938486 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5lflw" event={"ID":"9cddee0a-8b13-429b-89b6-e820f8f3ec59","Type":"ContainerDied","Data":"51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.938511 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5lflw" event={"ID":"9cddee0a-8b13-429b-89b6-e820f8f3ec59","Type":"ContainerStarted","Data":"c1ce59c10870c2ecd21ad32da1730316e1c9e1d338deac7b1c3b3f7688db298c"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.944641 4922 generic.go:334] "Generic (PLEG): container finished" podID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerID="f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c" exitCode=0 Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.947618 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerDied","Data":"f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 
11:39:08.947646 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerStarted","Data":"aa0626d406720474e06eba27d9c88b12751f048f72073c63b3e1e91b6784d080"} Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.956003 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zw6\" (UniqueName: \"kubernetes.io/projected/bf0d2342-e758-43cc-8c89-adc3ceb98453-kube-api-access-x7zw6\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.956132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f5b001-7d04-46f6-b77d-f79f28d8513e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.956154 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-catalog-content\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.956337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-utilities\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.956643 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45f5b001-7d04-46f6-b77d-f79f28d8513e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.957096 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45f5b001-7d04-46f6-b77d-f79f28d8513e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:08 crc kubenswrapper[4922]: W0218 11:39:08.957877 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2a5135b69f4b3807d7677aeea252c6b9b3eb722654036f6b8e1f09af0f3488da WatchSource:0}: Error finding container 2a5135b69f4b3807d7677aeea252c6b9b3eb722654036f6b8e1f09af0f3488da: Status 404 returned error can't find the container with id 2a5135b69f4b3807d7677aeea252c6b9b3eb722654036f6b8e1f09af0f3488da Feb 18 11:39:08 crc kubenswrapper[4922]: I0218 11:39:08.981833 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f5b001-7d04-46f6-b77d-f79f28d8513e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:08.998989 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.041299 4922 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.058480 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zw6\" (UniqueName: \"kubernetes.io/projected/bf0d2342-e758-43cc-8c89-adc3ceb98453-kube-api-access-x7zw6\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.058647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-catalog-content\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.058697 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-utilities\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.059806 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-utilities\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.061951 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-catalog-content\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " 
pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.098479 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zw6\" (UniqueName: \"kubernetes.io/projected/bf0d2342-e758-43cc-8c89-adc3ceb98453-kube-api-access-x7zw6\") pod \"redhat-marketplace-5vjsn\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.130733 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:39:09 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Feb 18 11:39:09 crc kubenswrapper[4922]: [+]process-running ok Feb 18 11:39:09 crc kubenswrapper[4922]: healthz check failed Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.130794 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.160220 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt2rf"] Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.198500 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:39:09 crc kubenswrapper[4922]: W0218 11:39:09.200926 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9f6f1e_d5ab_4de4_b8b4_ee14f742f2e0.slice/crio-ea1a8d4dd7f49c86b94455df1de54cf958a56e87072aa59f4711d51402743ec5 WatchSource:0}: Error finding container ea1a8d4dd7f49c86b94455df1de54cf958a56e87072aa59f4711d51402743ec5: Status 404 returned error can't find the container with id ea1a8d4dd7f49c86b94455df1de54cf958a56e87072aa59f4711d51402743ec5 Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.296909 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nf4nk"] Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.304794 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.319740 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf4nk"] Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.369107 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.466905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-utilities\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.467001 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-catalog-content\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.467087 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdsv\" (UniqueName: \"kubernetes.io/projected/d0a05956-6087-461d-a271-52db98c6032a-kube-api-access-tzdsv\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.568067 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c707c4-5c62-438f-8312-2307d3ef0ba8-secret-volume\") pod \"75c707c4-5c62-438f-8312-2307d3ef0ba8\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.568496 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdbmc\" (UniqueName: \"kubernetes.io/projected/75c707c4-5c62-438f-8312-2307d3ef0ba8-kube-api-access-qdbmc\") pod 
\"75c707c4-5c62-438f-8312-2307d3ef0ba8\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.568601 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c707c4-5c62-438f-8312-2307d3ef0ba8-config-volume\") pod \"75c707c4-5c62-438f-8312-2307d3ef0ba8\" (UID: \"75c707c4-5c62-438f-8312-2307d3ef0ba8\") " Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.568747 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-catalog-content\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.568770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdsv\" (UniqueName: \"kubernetes.io/projected/d0a05956-6087-461d-a271-52db98c6032a-kube-api-access-tzdsv\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.568828 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-utilities\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.569460 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-utilities\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " 
pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.570031 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-catalog-content\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.570118 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c707c4-5c62-438f-8312-2307d3ef0ba8-config-volume" (OuterVolumeSpecName: "config-volume") pod "75c707c4-5c62-438f-8312-2307d3ef0ba8" (UID: "75c707c4-5c62-438f-8312-2307d3ef0ba8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.575004 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c707c4-5c62-438f-8312-2307d3ef0ba8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75c707c4-5c62-438f-8312-2307d3ef0ba8" (UID: "75c707c4-5c62-438f-8312-2307d3ef0ba8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.585222 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c707c4-5c62-438f-8312-2307d3ef0ba8-kube-api-access-qdbmc" (OuterVolumeSpecName: "kube-api-access-qdbmc") pod "75c707c4-5c62-438f-8312-2307d3ef0ba8" (UID: "75c707c4-5c62-438f-8312-2307d3ef0ba8"). InnerVolumeSpecName "kube-api-access-qdbmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.588694 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdsv\" (UniqueName: \"kubernetes.io/projected/d0a05956-6087-461d-a271-52db98c6032a-kube-api-access-tzdsv\") pod \"redhat-marketplace-nf4nk\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.620898 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 11:39:09 crc kubenswrapper[4922]: W0218 11:39:09.629795 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod45f5b001_7d04_46f6_b77d_f79f28d8513e.slice/crio-a9b4977daf388120fe76f0f273845c52dc404c3e21cd7f8660810678fb42ce9c WatchSource:0}: Error finding container a9b4977daf388120fe76f0f273845c52dc404c3e21cd7f8660810678fb42ce9c: Status 404 returned error can't find the container with id a9b4977daf388120fe76f0f273845c52dc404c3e21cd7f8660810678fb42ce9c Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.662683 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.669039 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vjsn"] Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.669798 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c707c4-5c62-438f-8312-2307d3ef0ba8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.669824 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c707c4-5c62-438f-8312-2307d3ef0ba8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.669835 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdbmc\" (UniqueName: \"kubernetes.io/projected/75c707c4-5c62-438f-8312-2307d3ef0ba8-kube-api-access-qdbmc\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:09 crc kubenswrapper[4922]: W0218 11:39:09.678135 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0d2342_e758_43cc_8c89_adc3ceb98453.slice/crio-253c1fe74cd3ee618dc4b79bd02124aafb18cc9d88de6205e0a30abc2ecadb34 WatchSource:0}: Error finding container 253c1fe74cd3ee618dc4b79bd02124aafb18cc9d88de6205e0a30abc2ecadb34: Status 404 returned error can't find the container with id 253c1fe74cd3ee618dc4b79bd02124aafb18cc9d88de6205e0a30abc2ecadb34 Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.722829 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.724022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:39:09 crc 
kubenswrapper[4922]: I0218 11:39:09.726687 4922 patch_prober.go:28] interesting pod/console-f9d7485db-nfn89 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.726737 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nfn89" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.810873 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.810946 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.835423 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.921520 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf4nk"] Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.943270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sddqb" 
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.952251 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.952236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv" event={"ID":"75c707c4-5c62-438f-8312-2307d3ef0ba8","Type":"ContainerDied","Data":"28c2cf88217349a49db593635ea8e128208ef1ae24d7cc6d1020cc30632765bf"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.952854 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28c2cf88217349a49db593635ea8e128208ef1ae24d7cc6d1020cc30632765bf"
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.953975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4eb08f37e8241ad9b183d2b32c46286809fafe4a88a928b684e7f920b1a52932"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.954020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2a5135b69f4b3807d7677aeea252c6b9b3eb722654036f6b8e1f09af0f3488da"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.957647 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d226335892aa0135b931a64fb29dcd629371ddaecbe8c104cc966a127e262f55"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.957693 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9afdc23b817126f5d66c3dd8ca0fb5dd583325bf96b4bbfe24218da4f9354664"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.958598 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.960583 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"eaeb02c76dcba489d6dea1f6a0d48a46cdcf6a4ee0f90ed3b572ff1b0654f3a8"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.960609 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8ed6222e5539873a8dc19ddf3c314d51efbdcdb13cb12f6de6e8a9cf3359f762"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.967758 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" event={"ID":"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0","Type":"ContainerStarted","Data":"9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.967798 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" event={"ID":"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0","Type":"ContainerStarted","Data":"ea1a8d4dd7f49c86b94455df1de54cf958a56e87072aa59f4711d51402743ec5"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.967815 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.980311 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-b6dxx container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.980329 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-b6dxx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.980900 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b6dxx" podUID="48dabf7e-d1d7-48b6-bc70-5cc88cdcf994" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.980380 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b6dxx" podUID="48dabf7e-d1d7-48b6-bc70-5cc88cdcf994" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Feb 18 11:39:09 crc kubenswrapper[4922]: W0218 11:39:09.984164 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a05956_6087_461d_a271_52db98c6032a.slice/crio-a1f5dac4997bf52c337b4ff07c819ce5fc50051e9cba18a9a9a797070bbd762b WatchSource:0}: Error finding container a1f5dac4997bf52c337b4ff07c819ce5fc50051e9cba18a9a9a797070bbd762b: Status 404 returned error can't find the container with id a1f5dac4997bf52c337b4ff07c819ce5fc50051e9cba18a9a9a797070bbd762b
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.991909 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerDied","Data":"88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50"}
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.992091 4922 generic.go:334] "Generic (PLEG): container finished" podID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerID="88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50" exitCode=0
Feb 18 11:39:09 crc kubenswrapper[4922]: I0218 11:39:09.992165 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerStarted","Data":"253c1fe74cd3ee618dc4b79bd02124aafb18cc9d88de6205e0a30abc2ecadb34"}
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.011326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"45f5b001-7d04-46f6-b77d-f79f28d8513e","Type":"ContainerStarted","Data":"a9b4977daf388120fe76f0f273845c52dc404c3e21cd7f8660810678fb42ce9c"}
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.056199 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" podStartSLOduration=132.056179417 podStartE2EDuration="2m12.056179417s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:10.053710404 +0000 UTC m=+151.781414484" watchObservedRunningTime="2026-02-18 11:39:10.056179417 +0000 UTC m=+151.783883497"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.086110 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.094129 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vhdd8"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.115710 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ktkz9"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.118809 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:39:10 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld
Feb 18 11:39:10 crc kubenswrapper[4922]: [+]process-running ok
Feb 18 11:39:10 crc kubenswrapper[4922]: healthz check failed
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.118856 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.273263 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wz74v"]
Feb 18 11:39:10 crc kubenswrapper[4922]: E0218 11:39:10.273554 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c707c4-5c62-438f-8312-2307d3ef0ba8" containerName="collect-profiles"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.273568 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c707c4-5c62-438f-8312-2307d3ef0ba8" containerName="collect-profiles"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.273689 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c707c4-5c62-438f-8312-2307d3ef0ba8" containerName="collect-profiles"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.280197 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.284023 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.291278 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wz74v"]
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.381864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6qm\" (UniqueName: \"kubernetes.io/projected/47c627d0-6fb9-4b77-b266-74670361fcd6-kube-api-access-4z6qm\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.381906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-catalog-content\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.381999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-utilities\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.482919 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-utilities\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.483218 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6qm\" (UniqueName: \"kubernetes.io/projected/47c627d0-6fb9-4b77-b266-74670361fcd6-kube-api-access-4z6qm\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.483237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-catalog-content\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.483669 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-catalog-content\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.483872 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-utilities\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.522455 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6qm\" (UniqueName: \"kubernetes.io/projected/47c627d0-6fb9-4b77-b266-74670361fcd6-kube-api-access-4z6qm\") pod \"redhat-operators-wz74v\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.582961 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.606254 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wz74v"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.672805 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wg8w6"]
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.677701 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.679984 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wg8w6"]
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.787992 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-utilities\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.788041 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c89j\" (UniqueName: \"kubernetes.io/projected/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-kube-api-access-8c89j\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.788089 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-catalog-content\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.889998 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-utilities\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.890249 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c89j\" (UniqueName: \"kubernetes.io/projected/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-kube-api-access-8c89j\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.890279 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-catalog-content\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.890523 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-utilities\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.890563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-catalog-content\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.914027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c89j\" (UniqueName: \"kubernetes.io/projected/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-kube-api-access-8c89j\") pod \"redhat-operators-wg8w6\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") " pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:10 crc kubenswrapper[4922]: I0218 11:39:10.993810 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.028219 4922 generic.go:334] "Generic (PLEG): container finished" podID="d0a05956-6087-461d-a271-52db98c6032a" containerID="e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0" exitCode=0
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.028296 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf4nk" event={"ID":"d0a05956-6087-461d-a271-52db98c6032a","Type":"ContainerDied","Data":"e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0"}
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.028322 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf4nk" event={"ID":"d0a05956-6087-461d-a271-52db98c6032a","Type":"ContainerStarted","Data":"a1f5dac4997bf52c337b4ff07c819ce5fc50051e9cba18a9a9a797070bbd762b"}
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.031827 4922 generic.go:334] "Generic (PLEG): container finished" podID="45f5b001-7d04-46f6-b77d-f79f28d8513e" containerID="1a5b9e642430835de9898ea0d1086d2036a4ac3e11f7db0b73326129db5097a8" exitCode=0
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.031968 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"45f5b001-7d04-46f6-b77d-f79f28d8513e","Type":"ContainerDied","Data":"1a5b9e642430835de9898ea0d1086d2036a4ac3e11f7db0b73326129db5097a8"}
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.119597 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:39:11 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld
Feb 18 11:39:11 crc kubenswrapper[4922]: [+]process-running ok
Feb 18 11:39:11 crc kubenswrapper[4922]: healthz check failed
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.119741 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.174356 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wz74v"]
Feb 18 11:39:11 crc kubenswrapper[4922]: I0218 11:39:11.265139 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wg8w6"]
Feb 18 11:39:11 crc kubenswrapper[4922]: W0218 11:39:11.307428 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80efc1b6_0ae2_4cbf_8dc9_0e2c4d526f54.slice/crio-25d2b5e0e4b5a6b68be1bcb304a04f4b6084a6c0bdb0404fc5532afe958fbbd6 WatchSource:0}: Error finding container 25d2b5e0e4b5a6b68be1bcb304a04f4b6084a6c0bdb0404fc5532afe958fbbd6: Status 404 returned error can't find the container with id 25d2b5e0e4b5a6b68be1bcb304a04f4b6084a6c0bdb0404fc5532afe958fbbd6
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.041411 4922 generic.go:334] "Generic (PLEG): container finished" podID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerID="339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679" exitCode=0
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.041579 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerDied","Data":"339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679"}
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.041713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerStarted","Data":"86936b3faf99a98562fccc6ce9e3e9f7de7879c692a3b15d363c67f9bb07864e"}
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.045643 4922 generic.go:334] "Generic (PLEG): container finished" podID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerID="8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777" exitCode=0
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.045725 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wg8w6" event={"ID":"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54","Type":"ContainerDied","Data":"8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777"}
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.045765 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wg8w6" event={"ID":"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54","Type":"ContainerStarted","Data":"25d2b5e0e4b5a6b68be1bcb304a04f4b6084a6c0bdb0404fc5532afe958fbbd6"}
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.118099 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:39:12 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld
Feb 18 11:39:12 crc kubenswrapper[4922]: [+]process-running ok
Feb 18 11:39:12 crc kubenswrapper[4922]: healthz check failed
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.118147 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.292897 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.424740 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45f5b001-7d04-46f6-b77d-f79f28d8513e-kubelet-dir\") pod \"45f5b001-7d04-46f6-b77d-f79f28d8513e\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") "
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.424884 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f5b001-7d04-46f6-b77d-f79f28d8513e-kube-api-access\") pod \"45f5b001-7d04-46f6-b77d-f79f28d8513e\" (UID: \"45f5b001-7d04-46f6-b77d-f79f28d8513e\") "
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.426097 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45f5b001-7d04-46f6-b77d-f79f28d8513e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "45f5b001-7d04-46f6-b77d-f79f28d8513e" (UID: "45f5b001-7d04-46f6-b77d-f79f28d8513e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.437563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f5b001-7d04-46f6-b77d-f79f28d8513e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "45f5b001-7d04-46f6-b77d-f79f28d8513e" (UID: "45f5b001-7d04-46f6-b77d-f79f28d8513e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.526736 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f5b001-7d04-46f6-b77d-f79f28d8513e-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.526799 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45f5b001-7d04-46f6-b77d-f79f28d8513e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.722375 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:39:12 crc kubenswrapper[4922]: I0218 11:39:12.726749 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ks48g"
Feb 18 11:39:13 crc kubenswrapper[4922]: I0218 11:39:13.072827 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"45f5b001-7d04-46f6-b77d-f79f28d8513e","Type":"ContainerDied","Data":"a9b4977daf388120fe76f0f273845c52dc404c3e21cd7f8660810678fb42ce9c"}
Feb 18 11:39:13 crc kubenswrapper[4922]: I0218 11:39:13.073189 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b4977daf388120fe76f0f273845c52dc404c3e21cd7f8660810678fb42ce9c"
Feb 18 11:39:13 crc kubenswrapper[4922]: I0218 11:39:13.072904 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:39:13 crc kubenswrapper[4922]: I0218 11:39:13.119329 4922 patch_prober.go:28] interesting pod/router-default-5444994796-ktkz9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:39:13 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld
Feb 18 11:39:13 crc kubenswrapper[4922]: [+]process-running ok
Feb 18 11:39:13 crc kubenswrapper[4922]: healthz check failed
Feb 18 11:39:13 crc kubenswrapper[4922]: I0218 11:39:13.119429 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ktkz9" podUID="9f7b66c5-b258-4314-b3a5-e08b958245b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.128259 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ktkz9"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.131316 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ktkz9"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.249439 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 11:39:14 crc kubenswrapper[4922]: E0218 11:39:14.249787 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f5b001-7d04-46f6-b77d-f79f28d8513e" containerName="pruner"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.249809 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f5b001-7d04-46f6-b77d-f79f28d8513e" containerName="pruner"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.249936 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f5b001-7d04-46f6-b77d-f79f28d8513e" containerName="pruner"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.250452 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.252593 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.252983 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.259184 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.390301 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.390405 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.492176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.492247 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.492359 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.521041 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:14 crc kubenswrapper[4922]: I0218 11:39:14.590855 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:15 crc kubenswrapper[4922]: I0218 11:39:15.232753 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 18 11:39:15 crc kubenswrapper[4922]: I0218 11:39:15.693864 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gbpm4"
Feb 18 11:39:16 crc kubenswrapper[4922]: I0218 11:39:16.114324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a4f0f44-f90d-4e2e-a41a-0d785c890c11","Type":"ContainerStarted","Data":"59b18b090d0689c87a8a1db521371c6f148f035c17433d7c2edcc69c82db0fb3"}
Feb 18 11:39:16 crc kubenswrapper[4922]: I0218 11:39:16.114393 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a4f0f44-f90d-4e2e-a41a-0d785c890c11","Type":"ContainerStarted","Data":"ccbaee4d994d716c70b98bd2ca44e40786371cc7260f2de61281d7fb8c3aecf3"}
Feb 18 11:39:17 crc kubenswrapper[4922]: I0218 11:39:17.123064 4922 generic.go:334] "Generic (PLEG): container finished" podID="6a4f0f44-f90d-4e2e-a41a-0d785c890c11" containerID="59b18b090d0689c87a8a1db521371c6f148f035c17433d7c2edcc69c82db0fb3" exitCode=0
Feb 18 11:39:17 crc kubenswrapper[4922]: I0218 11:39:17.123280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a4f0f44-f90d-4e2e-a41a-0d785c890c11","Type":"ContainerDied","Data":"59b18b090d0689c87a8a1db521371c6f148f035c17433d7c2edcc69c82db0fb3"}
Feb 18 11:39:19 crc kubenswrapper[4922]: I0218 11:39:19.723784 4922 patch_prober.go:28] interesting pod/console-f9d7485db-nfn89 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Feb 18 11:39:19 crc kubenswrapper[4922]: I0218 11:39:19.724544 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nfn89" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Feb 18 11:39:19 crc kubenswrapper[4922]: I0218 11:39:19.979175 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-b6dxx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Feb 18 11:39:19 crc kubenswrapper[4922]: I0218 11:39:19.979227 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b6dxx" podUID="48dabf7e-d1d7-48b6-bc70-5cc88cdcf994" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Feb 18 11:39:19 crc kubenswrapper[4922]: I0218 11:39:19.980595 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-b6dxx container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Feb 18 11:39:19 crc kubenswrapper[4922]: I0218 11:39:19.980670 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b6dxx" podUID="48dabf7e-d1d7-48b6-bc70-5cc88cdcf994" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Feb 18 11:39:21 crc kubenswrapper[4922]: I0218 11:39:21.816927 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:39:21 crc kubenswrapper[4922]: I0218 11:39:21.823422 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4702cf45-b47b-4291-a553-5bfc7bc22674-metrics-certs\") pod \"network-metrics-daemon-pspfr\" (UID: \"4702cf45-b47b-4291-a553-5bfc7bc22674\") " pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:39:21 crc kubenswrapper[4922]: I0218 11:39:21.994587 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pspfr"
Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.085171 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.120713 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kube-api-access\") pod \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") "
Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.120769 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kubelet-dir\") pod \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\" (UID: \"6a4f0f44-f90d-4e2e-a41a-0d785c890c11\") "
Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.120968 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a4f0f44-f90d-4e2e-a41a-0d785c890c11" (UID: "6a4f0f44-f90d-4e2e-a41a-0d785c890c11"). InnerVolumeSpecName "kubelet-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.124274 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a4f0f44-f90d-4e2e-a41a-0d785c890c11" (UID: "6a4f0f44-f90d-4e2e-a41a-0d785c890c11"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.153437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a4f0f44-f90d-4e2e-a41a-0d785c890c11","Type":"ContainerDied","Data":"ccbaee4d994d716c70b98bd2ca44e40786371cc7260f2de61281d7fb8c3aecf3"} Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.153575 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccbaee4d994d716c70b98bd2ca44e40786371cc7260f2de61281d7fb8c3aecf3" Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.153480 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.222024 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:22 crc kubenswrapper[4922]: I0218 11:39:22.222065 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4f0f44-f90d-4e2e-a41a-0d785c890c11-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:27 crc kubenswrapper[4922]: I0218 11:39:27.486049 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pj7qx"] Feb 18 11:39:27 crc kubenswrapper[4922]: I0218 11:39:27.486717 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" containerID="cri-o://5e927c6e4399c8def57efddaefd76cf053c5e57aa111182c3f59a17801fceec0" gracePeriod=30 Feb 18 11:39:27 crc kubenswrapper[4922]: I0218 11:39:27.512791 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"] Feb 18 11:39:27 crc kubenswrapper[4922]: I0218 11:39:27.513468 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" containerID="cri-o://6194b5e6d35bc369d7c503f0dab1c1b4699e56c00e185999f0dd2d244bb4e4b6" gracePeriod=30 Feb 18 11:39:28 crc kubenswrapper[4922]: I0218 11:39:28.772613 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:39:29 crc kubenswrapper[4922]: I0218 11:39:29.728073 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:39:29 crc kubenswrapper[4922]: I0218 11:39:29.735182 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:39:29 crc kubenswrapper[4922]: I0218 11:39:29.927109 4922 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-v6m8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 18 11:39:29 crc kubenswrapper[4922]: I0218 11:39:29.927171 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 18 11:39:29 crc kubenswrapper[4922]: I0218 11:39:29.984705 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-b6dxx" Feb 18 11:39:30 crc kubenswrapper[4922]: I0218 11:39:30.831915 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pj7qx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 18 11:39:30 crc kubenswrapper[4922]: I0218 11:39:30.832910 4922 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 18 11:39:35 crc kubenswrapper[4922]: I0218 11:39:35.236661 4922 generic.go:334] "Generic (PLEG): container finished" podID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerID="6194b5e6d35bc369d7c503f0dab1c1b4699e56c00e185999f0dd2d244bb4e4b6" exitCode=0 Feb 18 11:39:35 crc kubenswrapper[4922]: I0218 11:39:35.236750 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" event={"ID":"c3bb2fc9-822c-4f53-98bf-70933744cf7f","Type":"ContainerDied","Data":"6194b5e6d35bc369d7c503f0dab1c1b4699e56c00e185999f0dd2d244bb4e4b6"} Feb 18 11:39:35 crc kubenswrapper[4922]: I0218 11:39:35.239221 4922 generic.go:334] "Generic (PLEG): container finished" podID="158f0672-c017-4e45-a564-96de81f21772" containerID="5e927c6e4399c8def57efddaefd76cf053c5e57aa111182c3f59a17801fceec0" exitCode=0 Feb 18 11:39:35 crc kubenswrapper[4922]: I0218 11:39:35.239262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" event={"ID":"158f0672-c017-4e45-a564-96de81f21772","Type":"ContainerDied","Data":"5e927c6e4399c8def57efddaefd76cf053c5e57aa111182c3f59a17801fceec0"} Feb 18 11:39:39 crc kubenswrapper[4922]: I0218 11:39:39.807674 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:39:39 crc kubenswrapper[4922]: I0218 11:39:39.808434 4922 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:39:40 crc kubenswrapper[4922]: I0218 11:39:40.553722 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rj4b9" Feb 18 11:39:40 crc kubenswrapper[4922]: I0218 11:39:40.833016 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pj7qx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 18 11:39:40 crc kubenswrapper[4922]: I0218 11:39:40.833081 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 18 11:39:40 crc kubenswrapper[4922]: I0218 11:39:40.926878 4922 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-v6m8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 18 11:39:40 crc kubenswrapper[4922]: I0218 11:39:40.926940 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" 
podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 18 11:39:42 crc kubenswrapper[4922]: E0218 11:39:42.238139 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 18 11:39:42 crc kubenswrapper[4922]: E0218 11:39:42.238892 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8c89j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:n
il,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wg8w6_openshift-marketplace(80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:39:42 crc kubenswrapper[4922]: E0218 11:39:42.240215 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wg8w6" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" Feb 18 11:39:43 crc kubenswrapper[4922]: E0218 11:39:43.342083 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wg8w6" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" Feb 18 11:39:43 crc kubenswrapper[4922]: E0218 11:39:43.428923 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 18 11:39:43 crc kubenswrapper[4922]: E0218 11:39:43.429214 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tzdsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nf4nk_openshift-marketplace(d0a05956-6087-461d-a271-52db98c6032a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:39:43 crc kubenswrapper[4922]: E0218 11:39:43.430901 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nf4nk" podUID="d0a05956-6087-461d-a271-52db98c6032a" Feb 18 11:39:43 crc 
kubenswrapper[4922]: E0218 11:39:43.488745 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 18 11:39:43 crc kubenswrapper[4922]: E0218 11:39:43.489160 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7zw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-5vjsn_openshift-marketplace(bf0d2342-e758-43cc-8c89-adc3ceb98453): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:39:43 crc kubenswrapper[4922]: E0218 11:39:43.490407 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5vjsn" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.510707 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5vjsn" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.511347 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nf4nk" podUID="d0a05956-6087-461d-a271-52db98c6032a" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.563496 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.570539 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.593670 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx"] Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.593935 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4f0f44-f90d-4e2e-a41a-0d785c890c11" containerName="pruner" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.593952 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4f0f44-f90d-4e2e-a41a-0d785c890c11" containerName="pruner" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.593975 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.593984 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.594000 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.594009 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.594134 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="158f0672-c017-4e45-a564-96de81f21772" containerName="controller-manager" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.594148 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" containerName="route-controller-manager" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.594164 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6a4f0f44-f90d-4e2e-a41a-0d785c890c11" containerName="pruner" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.594649 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.605675 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx"] Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667555 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k46td\" (UniqueName: \"kubernetes.io/projected/158f0672-c017-4e45-a564-96de81f21772-kube-api-access-k46td\") pod \"158f0672-c017-4e45-a564-96de81f21772\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667616 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158f0672-c017-4e45-a564-96de81f21772-serving-cert\") pod \"158f0672-c017-4e45-a564-96de81f21772\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667680 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-config\") pod \"158f0672-c017-4e45-a564-96de81f21772\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667716 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-client-ca\") pod \"158f0672-c017-4e45-a564-96de81f21772\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667743 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bb2fc9-822c-4f53-98bf-70933744cf7f-serving-cert\") pod \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667782 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-config\") pod \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667809 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-client-ca\") pod \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667844 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lkpw\" (UniqueName: \"kubernetes.io/projected/c3bb2fc9-822c-4f53-98bf-70933744cf7f-kube-api-access-6lkpw\") pod \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\" (UID: \"c3bb2fc9-822c-4f53-98bf-70933744cf7f\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.667890 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-proxy-ca-bundles\") pod \"158f0672-c017-4e45-a564-96de81f21772\" (UID: \"158f0672-c017-4e45-a564-96de81f21772\") " Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668115 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhth\" (UniqueName: \"kubernetes.io/projected/15bde054-692c-48d7-9289-37ba209fa899-kube-api-access-tjhth\") pod 
\"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668160 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-client-ca\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15bde054-692c-48d7-9289-37ba209fa899-serving-cert\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668216 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-config\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668448 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-client-ca" (OuterVolumeSpecName: "client-ca") pod "158f0672-c017-4e45-a564-96de81f21772" (UID: "158f0672-c017-4e45-a564-96de81f21772"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668587 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-config" (OuterVolumeSpecName: "config") pod "158f0672-c017-4e45-a564-96de81f21772" (UID: "158f0672-c017-4e45-a564-96de81f21772"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668607 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-config" (OuterVolumeSpecName: "config") pod "c3bb2fc9-822c-4f53-98bf-70933744cf7f" (UID: "c3bb2fc9-822c-4f53-98bf-70933744cf7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.668801 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "158f0672-c017-4e45-a564-96de81f21772" (UID: "158f0672-c017-4e45-a564-96de81f21772"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.669133 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3bb2fc9-822c-4f53-98bf-70933744cf7f" (UID: "c3bb2fc9-822c-4f53-98bf-70933744cf7f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.681742 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bb2fc9-822c-4f53-98bf-70933744cf7f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3bb2fc9-822c-4f53-98bf-70933744cf7f" (UID: "c3bb2fc9-822c-4f53-98bf-70933744cf7f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.682874 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158f0672-c017-4e45-a564-96de81f21772-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "158f0672-c017-4e45-a564-96de81f21772" (UID: "158f0672-c017-4e45-a564-96de81f21772"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.682946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158f0672-c017-4e45-a564-96de81f21772-kube-api-access-k46td" (OuterVolumeSpecName: "kube-api-access-k46td") pod "158f0672-c017-4e45-a564-96de81f21772" (UID: "158f0672-c017-4e45-a564-96de81f21772"). InnerVolumeSpecName "kube-api-access-k46td". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.683503 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bb2fc9-822c-4f53-98bf-70933744cf7f-kube-api-access-6lkpw" (OuterVolumeSpecName: "kube-api-access-6lkpw") pod "c3bb2fc9-822c-4f53-98bf-70933744cf7f" (UID: "c3bb2fc9-822c-4f53-98bf-70933744cf7f"). InnerVolumeSpecName "kube-api-access-6lkpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769054 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-client-ca\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769114 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15bde054-692c-48d7-9289-37ba209fa899-serving-cert\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769134 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-config\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769195 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhth\" (UniqueName: \"kubernetes.io/projected/15bde054-692c-48d7-9289-37ba209fa899-kube-api-access-tjhth\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769260 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769275 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769288 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bb2fc9-822c-4f53-98bf-70933744cf7f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769299 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769309 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3bb2fc9-822c-4f53-98bf-70933744cf7f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769320 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lkpw\" (UniqueName: \"kubernetes.io/projected/c3bb2fc9-822c-4f53-98bf-70933744cf7f-kube-api-access-6lkpw\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769332 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/158f0672-c017-4e45-a564-96de81f21772-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.769343 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k46td\" (UniqueName: \"kubernetes.io/projected/158f0672-c017-4e45-a564-96de81f21772-kube-api-access-k46td\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc 
kubenswrapper[4922]: I0218 11:39:45.769353 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158f0672-c017-4e45-a564-96de81f21772-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.770719 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-client-ca\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.770770 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-config\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.773060 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15bde054-692c-48d7-9289-37ba209fa899-serving-cert\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 11:39:45.785518 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhth\" (UniqueName: \"kubernetes.io/projected/15bde054-692c-48d7-9289-37ba209fa899-kube-api-access-tjhth\") pod \"route-controller-manager-588997d685-rq7xx\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: I0218 
11:39:45.932565 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.981733 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.981880 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvvjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMess
agePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lcnjk_openshift-marketplace(fc9b41f8-ac9b-4166-a2a6-80326e19254a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:39:45 crc kubenswrapper[4922]: E0218 11:39:45.983050 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lcnjk" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.086144 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pspfr"] Feb 18 11:39:46 crc kubenswrapper[4922]: E0218 11:39:46.176234 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 11:39:46 crc kubenswrapper[4922]: E0218 11:39:46.176468 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6x5ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5lflw_openshift-marketplace(9cddee0a-8b13-429b-89b6-e820f8f3ec59): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:39:46 crc kubenswrapper[4922]: E0218 11:39:46.177680 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5lflw" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" Feb 18 11:39:46 crc 
kubenswrapper[4922]: I0218 11:39:46.186191 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx"] Feb 18 11:39:46 crc kubenswrapper[4922]: W0218 11:39:46.196489 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15bde054_692c_48d7_9289_37ba209fa899.slice/crio-4ffa7a4229a3fad7ed4a84f18a413cfd9e9c021ec94c8917ef154f3e1f7c78a5 WatchSource:0}: Error finding container 4ffa7a4229a3fad7ed4a84f18a413cfd9e9c021ec94c8917ef154f3e1f7c78a5: Status 404 returned error can't find the container with id 4ffa7a4229a3fad7ed4a84f18a413cfd9e9c021ec94c8917ef154f3e1f7c78a5 Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.293112 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" event={"ID":"c3bb2fc9-822c-4f53-98bf-70933744cf7f","Type":"ContainerDied","Data":"52db1346d3e5c28577ccc9c9f0dc56c1b32dc4dc3a9883af1710dab9823da00c"} Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.293173 4922 scope.go:117] "RemoveContainer" containerID="6194b5e6d35bc369d7c503f0dab1c1b4699e56c00e185999f0dd2d244bb4e4b6" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.293141 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.296096 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" event={"ID":"15bde054-692c-48d7-9289-37ba209fa899","Type":"ContainerStarted","Data":"4ffa7a4229a3fad7ed4a84f18a413cfd9e9c021ec94c8917ef154f3e1f7c78a5"} Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.301601 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerStarted","Data":"e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1"} Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.308007 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerStarted","Data":"a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395"} Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.309812 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pspfr" event={"ID":"4702cf45-b47b-4291-a553-5bfc7bc22674","Type":"ContainerStarted","Data":"fe2dfb96ca37dee3ade1541dcfa518d248bc5a915bfb3f141cd9748702d9dc7b"} Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.311003 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" event={"ID":"158f0672-c017-4e45-a564-96de81f21772","Type":"ContainerDied","Data":"2d992c5fc4f578358729ef2e1374662b53a6ba4618dfe838d93d24e0c5e5a820"} Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.311071 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pj7qx" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.314442 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrxb6" event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerStarted","Data":"f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1"} Feb 18 11:39:46 crc kubenswrapper[4922]: E0218 11:39:46.315313 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5lflw" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.315404 4922 scope.go:117] "RemoveContainer" containerID="5e927c6e4399c8def57efddaefd76cf053c5e57aa111182c3f59a17801fceec0" Feb 18 11:39:46 crc kubenswrapper[4922]: E0218 11:39:46.315828 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lcnjk" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.399873 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"] Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.402478 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v6m8j"] Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.446989 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pj7qx"] Feb 18 11:39:46 crc 
kubenswrapper[4922]: I0218 11:39:46.451579 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pj7qx"] Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.979380 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158f0672-c017-4e45-a564-96de81f21772" path="/var/lib/kubelet/pods/158f0672-c017-4e45-a564-96de81f21772/volumes" Feb 18 11:39:46 crc kubenswrapper[4922]: I0218 11:39:46.980242 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bb2fc9-822c-4f53-98bf-70933744cf7f" path="/var/lib/kubelet/pods/c3bb2fc9-822c-4f53-98bf-70933744cf7f/volumes" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.204197 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-588bb8688f-kgnf7"] Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.205036 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.206706 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.206876 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.207682 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.207933 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.207944 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" 
Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.212099 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.216757 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-588bb8688f-kgnf7"] Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.219415 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.283004 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx"] Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.288906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/381bae09-292e-47ab-a85b-eeec711acdd9-serving-cert\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.288951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-client-ca\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.289053 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqh2b\" (UniqueName: \"kubernetes.io/projected/381bae09-292e-47ab-a85b-eeec711acdd9-kube-api-access-dqh2b\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: 
\"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.289089 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-proxy-ca-bundles\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.289114 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-config\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.320413 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pspfr" event={"ID":"4702cf45-b47b-4291-a553-5bfc7bc22674","Type":"ContainerStarted","Data":"8badc5fff288446a9924553a34a6c1a2dbca9ca14cd487ff8689885221e9eb21"} Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.320479 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pspfr" event={"ID":"4702cf45-b47b-4291-a553-5bfc7bc22674","Type":"ContainerStarted","Data":"f5c18659fc59cc701f027079e2e983aa960911b7cfcf99509ebb977934856e7f"} Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.323393 4922 generic.go:334] "Generic (PLEG): container finished" podID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerID="f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1" exitCode=0 Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.323539 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-nrxb6" event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerDied","Data":"f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1"} Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.327249 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" event={"ID":"15bde054-692c-48d7-9289-37ba209fa899","Type":"ContainerStarted","Data":"7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c"} Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.327640 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.330408 4922 generic.go:334] "Generic (PLEG): container finished" podID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerID="e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1" exitCode=0 Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.330474 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerDied","Data":"e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1"} Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.332746 4922 generic.go:334] "Generic (PLEG): container finished" podID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerID="a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395" exitCode=0 Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.332778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerDied","Data":"a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395"} Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.335224 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.345054 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pspfr" podStartSLOduration=169.345036652 podStartE2EDuration="2m49.345036652s" podCreationTimestamp="2026-02-18 11:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:47.342211401 +0000 UTC m=+189.069915481" watchObservedRunningTime="2026-02-18 11:39:47.345036652 +0000 UTC m=+189.072740732" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.390329 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/381bae09-292e-47ab-a85b-eeec711acdd9-serving-cert\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.390404 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-client-ca\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.390468 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqh2b\" (UniqueName: \"kubernetes.io/projected/381bae09-292e-47ab-a85b-eeec711acdd9-kube-api-access-dqh2b\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 
18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.390543 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-proxy-ca-bundles\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.390589 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-config\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.393381 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-client-ca\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.393630 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-config\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.393821 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-proxy-ca-bundles\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " 
pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.396893 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" podStartSLOduration=20.396873745 podStartE2EDuration="20.396873745s" podCreationTimestamp="2026-02-18 11:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:47.393204982 +0000 UTC m=+189.120909062" watchObservedRunningTime="2026-02-18 11:39:47.396873745 +0000 UTC m=+189.124577825" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.412675 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqh2b\" (UniqueName: \"kubernetes.io/projected/381bae09-292e-47ab-a85b-eeec711acdd9-kube-api-access-dqh2b\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.429231 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/381bae09-292e-47ab-a85b-eeec711acdd9-serving-cert\") pod \"controller-manager-588bb8688f-kgnf7\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.520779 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:47 crc kubenswrapper[4922]: I0218 11:39:47.702901 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-588bb8688f-kgnf7"] Feb 18 11:39:47 crc kubenswrapper[4922]: W0218 11:39:47.709685 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod381bae09_292e_47ab_a85b_eeec711acdd9.slice/crio-24eefe81e8715469a6d268edf7354dca96b3d7cb46067661d42653e51f19e9c9 WatchSource:0}: Error finding container 24eefe81e8715469a6d268edf7354dca96b3d7cb46067661d42653e51f19e9c9: Status 404 returned error can't find the container with id 24eefe81e8715469a6d268edf7354dca96b3d7cb46067661d42653e51f19e9c9 Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.314671 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.341304 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrxb6" event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerStarted","Data":"ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd"} Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.343593 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" event={"ID":"381bae09-292e-47ab-a85b-eeec711acdd9","Type":"ContainerStarted","Data":"888942b059f1b7ce4402d4d451bbe5ffef2dc01e980cc850af5d6d975e6db9a0"} Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.343634 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" 
event={"ID":"381bae09-292e-47ab-a85b-eeec711acdd9","Type":"ContainerStarted","Data":"24eefe81e8715469a6d268edf7354dca96b3d7cb46067661d42653e51f19e9c9"} Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.343972 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.351924 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerStarted","Data":"f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927"} Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.355855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerStarted","Data":"8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77"} Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.356127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.356266 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" podUID="15bde054-692c-48d7-9289-37ba209fa899" containerName="route-controller-manager" containerID="cri-o://7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c" gracePeriod=30 Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.371926 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nrxb6" podStartSLOduration=2.47432703 podStartE2EDuration="41.371909604s" podCreationTimestamp="2026-02-18 11:39:07 +0000 UTC" firstStartedPulling="2026-02-18 11:39:08.926136785 +0000 UTC 
m=+150.653840865" lastFinishedPulling="2026-02-18 11:39:47.823719359 +0000 UTC m=+189.551423439" observedRunningTime="2026-02-18 11:39:48.371577306 +0000 UTC m=+190.099281386" watchObservedRunningTime="2026-02-18 11:39:48.371909604 +0000 UTC m=+190.099613684" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.388850 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wz74v" podStartSLOduration=2.681490798 podStartE2EDuration="38.388835453s" podCreationTimestamp="2026-02-18 11:39:10 +0000 UTC" firstStartedPulling="2026-02-18 11:39:12.044606967 +0000 UTC m=+153.772311047" lastFinishedPulling="2026-02-18 11:39:47.751951632 +0000 UTC m=+189.479655702" observedRunningTime="2026-02-18 11:39:48.38714703 +0000 UTC m=+190.114851110" watchObservedRunningTime="2026-02-18 11:39:48.388835453 +0000 UTC m=+190.116539533" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.449109 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" podStartSLOduration=1.449088028 podStartE2EDuration="1.449088028s" podCreationTimestamp="2026-02-18 11:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:48.443021984 +0000 UTC m=+190.170726064" watchObservedRunningTime="2026-02-18 11:39:48.449088028 +0000 UTC m=+190.176792108" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.452431 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7dzbt" podStartSLOduration=2.671793258 podStartE2EDuration="41.452414912s" podCreationTimestamp="2026-02-18 11:39:07 +0000 UTC" firstStartedPulling="2026-02-18 11:39:08.950178673 +0000 UTC m=+150.677882743" lastFinishedPulling="2026-02-18 11:39:47.730800317 +0000 UTC m=+189.458504397" observedRunningTime="2026-02-18 
11:39:48.404855468 +0000 UTC m=+190.132559558" watchObservedRunningTime="2026-02-18 11:39:48.452414912 +0000 UTC m=+190.180118992" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.772756 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.808275 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"] Feb 18 11:39:48 crc kubenswrapper[4922]: E0218 11:39:48.808559 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bde054-692c-48d7-9289-37ba209fa899" containerName="route-controller-manager" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.808575 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bde054-692c-48d7-9289-37ba209fa899" containerName="route-controller-manager" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.808735 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="15bde054-692c-48d7-9289-37ba209fa899" containerName="route-controller-manager" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.809187 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.814332 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mhdl\" (UniqueName: \"kubernetes.io/projected/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-kube-api-access-7mhdl\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.814421 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-config\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.814452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-serving-cert\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.814486 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-client-ca\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.838527 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"] Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.915404 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-config\") pod \"15bde054-692c-48d7-9289-37ba209fa899\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.915467 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15bde054-692c-48d7-9289-37ba209fa899-serving-cert\") pod \"15bde054-692c-48d7-9289-37ba209fa899\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.915775 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-client-ca\") pod \"15bde054-692c-48d7-9289-37ba209fa899\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.916027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhth\" (UniqueName: \"kubernetes.io/projected/15bde054-692c-48d7-9289-37ba209fa899-kube-api-access-tjhth\") pod \"15bde054-692c-48d7-9289-37ba209fa899\" (UID: \"15bde054-692c-48d7-9289-37ba209fa899\") " Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.916309 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-config\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 
11:39:48.916380 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-serving-cert\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.916443 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-client-ca\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.916484 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-config" (OuterVolumeSpecName: "config") pod "15bde054-692c-48d7-9289-37ba209fa899" (UID: "15bde054-692c-48d7-9289-37ba209fa899"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.916639 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mhdl\" (UniqueName: \"kubernetes.io/projected/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-kube-api-access-7mhdl\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.916715 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.917410 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-config\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.917441 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-client-ca\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.917498 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-client-ca" (OuterVolumeSpecName: "client-ca") pod "15bde054-692c-48d7-9289-37ba209fa899" (UID: "15bde054-692c-48d7-9289-37ba209fa899"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.921730 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-serving-cert\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.926601 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15bde054-692c-48d7-9289-37ba209fa899-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "15bde054-692c-48d7-9289-37ba209fa899" (UID: "15bde054-692c-48d7-9289-37ba209fa899"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.926628 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15bde054-692c-48d7-9289-37ba209fa899-kube-api-access-tjhth" (OuterVolumeSpecName: "kube-api-access-tjhth") pod "15bde054-692c-48d7-9289-37ba209fa899" (UID: "15bde054-692c-48d7-9289-37ba209fa899"). InnerVolumeSpecName "kube-api-access-tjhth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:39:48 crc kubenswrapper[4922]: I0218 11:39:48.937847 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mhdl\" (UniqueName: \"kubernetes.io/projected/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-kube-api-access-7mhdl\") pod \"route-controller-manager-66bbd976d-c94kz\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.017147 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15bde054-692c-48d7-9289-37ba209fa899-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.017178 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhth\" (UniqueName: \"kubernetes.io/projected/15bde054-692c-48d7-9289-37ba209fa899-kube-api-access-tjhth\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.017202 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15bde054-692c-48d7-9289-37ba209fa899-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.128739 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.221561 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.222355 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.227531 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.228356 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.234519 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.320250 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8148f2a-3e96-4e61-8537-7cd39940a907-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.320311 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8148f2a-3e96-4e61-8537-7cd39940a907-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.368114 4922 generic.go:334] "Generic (PLEG): container finished" podID="15bde054-692c-48d7-9289-37ba209fa899" containerID="7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c" exitCode=0 Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.368785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" 
event={"ID":"15bde054-692c-48d7-9289-37ba209fa899","Type":"ContainerDied","Data":"7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c"} Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.368816 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" event={"ID":"15bde054-692c-48d7-9289-37ba209fa899","Type":"ContainerDied","Data":"4ffa7a4229a3fad7ed4a84f18a413cfd9e9c021ec94c8917ef154f3e1f7c78a5"} Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.368833 4922 scope.go:117] "RemoveContainer" containerID="7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.369315 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.395126 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx"] Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.404727 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-rq7xx"] Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.406458 4922 scope.go:117] "RemoveContainer" containerID="7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c" Feb 18 11:39:49 crc kubenswrapper[4922]: E0218 11:39:49.407555 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c\": container with ID starting with 7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c not found: ID does not exist" containerID="7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c" Feb 18 11:39:49 crc kubenswrapper[4922]: 
I0218 11:39:49.407587 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c"} err="failed to get container status \"7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c\": rpc error: code = NotFound desc = could not find container \"7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c\": container with ID starting with 7a881cfb4efbbdd30238b4a8820d29058b35729d117859298ac41459fbf8841c not found: ID does not exist" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.421697 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8148f2a-3e96-4e61-8537-7cd39940a907-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.421822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8148f2a-3e96-4e61-8537-7cd39940a907-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.427007 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8148f2a-3e96-4e61-8537-7cd39940a907-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.450618 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8148f2a-3e96-4e61-8537-7cd39940a907-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"f8148f2a-3e96-4e61-8537-7cd39940a907\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.601451 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.614318 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"] Feb 18 11:39:49 crc kubenswrapper[4922]: W0218 11:39:49.626548 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ed4eefa_9d69_468e_b783_af3d0a1e7e75.slice/crio-689bf74da1dea7687a9978545b709d3db28e7791ae16a26226addae8a1619d61 WatchSource:0}: Error finding container 689bf74da1dea7687a9978545b709d3db28e7791ae16a26226addae8a1619d61: Status 404 returned error can't find the container with id 689bf74da1dea7687a9978545b709d3db28e7791ae16a26226addae8a1619d61 Feb 18 11:39:49 crc kubenswrapper[4922]: I0218 11:39:49.878906 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.395021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f8148f2a-3e96-4e61-8537-7cd39940a907","Type":"ContainerStarted","Data":"c96d9c5829843ccf1859da54d4fbe26e00d3587d1857a6d9339eeb6676859e82"} Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.396396 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" event={"ID":"1ed4eefa-9d69-468e-b783-af3d0a1e7e75","Type":"ContainerStarted","Data":"e286f860b267a30576d94c00bd4a0c76c12231f84c1094cab4113f22c54770c4"} Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.396420 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" event={"ID":"1ed4eefa-9d69-468e-b783-af3d0a1e7e75","Type":"ContainerStarted","Data":"689bf74da1dea7687a9978545b709d3db28e7791ae16a26226addae8a1619d61"} Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.398345 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.402832 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.436287 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" podStartSLOduration=3.436271676 podStartE2EDuration="3.436271676s" podCreationTimestamp="2026-02-18 11:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:50.418746192 +0000 UTC m=+192.146450272" watchObservedRunningTime="2026-02-18 11:39:50.436271676 +0000 UTC m=+192.163975756" Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.606860 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wz74v" Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.607246 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wz74v" Feb 18 11:39:50 crc kubenswrapper[4922]: I0218 11:39:50.984665 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15bde054-692c-48d7-9289-37ba209fa899" path="/var/lib/kubelet/pods/15bde054-692c-48d7-9289-37ba209fa899/volumes" Feb 18 11:39:51 crc kubenswrapper[4922]: I0218 11:39:51.406713 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f8148f2a-3e96-4e61-8537-7cd39940a907","Type":"ContainerStarted","Data":"8781754afcfa0017e16623396079b0583edf117901c99e1ded8fe44b7e8ca6d4"} Feb 18 11:39:52 crc kubenswrapper[4922]: I0218 11:39:52.059297 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wz74v" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="registry-server" probeResult="failure" output=< Feb 18 11:39:52 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 11:39:52 crc kubenswrapper[4922]: > Feb 18 11:39:52 crc kubenswrapper[4922]: I0218 11:39:52.415511 4922 generic.go:334] "Generic (PLEG): container finished" podID="f8148f2a-3e96-4e61-8537-7cd39940a907" containerID="8781754afcfa0017e16623396079b0583edf117901c99e1ded8fe44b7e8ca6d4" exitCode=0 Feb 18 11:39:52 crc kubenswrapper[4922]: I0218 11:39:52.415603 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f8148f2a-3e96-4e61-8537-7cd39940a907","Type":"ContainerDied","Data":"8781754afcfa0017e16623396079b0583edf117901c99e1ded8fe44b7e8ca6d4"} Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.719374 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.880328 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8148f2a-3e96-4e61-8537-7cd39940a907-kube-api-access\") pod \"f8148f2a-3e96-4e61-8537-7cd39940a907\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.880407 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8148f2a-3e96-4e61-8537-7cd39940a907-kubelet-dir\") pod \"f8148f2a-3e96-4e61-8537-7cd39940a907\" (UID: \"f8148f2a-3e96-4e61-8537-7cd39940a907\") " Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.880543 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8148f2a-3e96-4e61-8537-7cd39940a907-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f8148f2a-3e96-4e61-8537-7cd39940a907" (UID: "f8148f2a-3e96-4e61-8537-7cd39940a907"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.880843 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8148f2a-3e96-4e61-8537-7cd39940a907-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.887226 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8148f2a-3e96-4e61-8537-7cd39940a907-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f8148f2a-3e96-4e61-8537-7cd39940a907" (UID: "f8148f2a-3e96-4e61-8537-7cd39940a907"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:39:53 crc kubenswrapper[4922]: I0218 11:39:53.981891 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8148f2a-3e96-4e61-8537-7cd39940a907-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.014059 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 11:39:54 crc kubenswrapper[4922]: E0218 11:39:54.014341 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8148f2a-3e96-4e61-8537-7cd39940a907" containerName="pruner" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.014375 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8148f2a-3e96-4e61-8537-7cd39940a907" containerName="pruner" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.014516 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8148f2a-3e96-4e61-8537-7cd39940a907" containerName="pruner" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.014997 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.032537 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.184074 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kube-api-access\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.184157 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.184313 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-var-lock\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.285862 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kube-api-access\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.286012 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.286043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-var-lock\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.286141 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-var-lock\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.286157 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.302346 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kube-api-access\") pod \"installer-9-crc\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.332566 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.431419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f8148f2a-3e96-4e61-8537-7cd39940a907","Type":"ContainerDied","Data":"c96d9c5829843ccf1859da54d4fbe26e00d3587d1857a6d9339eeb6676859e82"} Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.431454 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96d9c5829843ccf1859da54d4fbe26e00d3587d1857a6d9339eeb6676859e82" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.431501 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:39:54 crc kubenswrapper[4922]: I0218 11:39:54.813117 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 11:39:55 crc kubenswrapper[4922]: I0218 11:39:55.437353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad37f26c-293d-42c8-a88e-21e0a2c5e05d","Type":"ContainerStarted","Data":"0f5aa83e132da84ebb77b9c8d8371acd88a3509bc96c992baa1b131822fe3971"} Feb 18 11:39:55 crc kubenswrapper[4922]: I0218 11:39:55.437420 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad37f26c-293d-42c8-a88e-21e0a2c5e05d","Type":"ContainerStarted","Data":"2993e8b81cf0d5924b6da2a78a590d591a1692ddbff65fb9a4fa27002842c2e5"} Feb 18 11:39:55 crc kubenswrapper[4922]: I0218 11:39:55.452282 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.452261868 podStartE2EDuration="1.452261868s" podCreationTimestamp="2026-02-18 11:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:39:55.449703463 +0000 UTC m=+197.177407543" watchObservedRunningTime="2026-02-18 11:39:55.452261868 +0000 UTC m=+197.179965948" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.384478 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.384759 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.439413 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.448149 4922 generic.go:334] "Generic (PLEG): container finished" podID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerID="88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b" exitCode=0 Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.448213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wg8w6" event={"ID":"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54","Type":"ContainerDied","Data":"88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b"} Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.493012 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.921538 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.921596 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:57 crc kubenswrapper[4922]: I0218 11:39:57.999389 4922 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:58 crc kubenswrapper[4922]: I0218 11:39:58.497248 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:39:59 crc kubenswrapper[4922]: I0218 11:39:59.203815 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrxb6"] Feb 18 11:40:00 crc kubenswrapper[4922]: I0218 11:40:00.464186 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nrxb6" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="registry-server" containerID="cri-o://ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd" gracePeriod=2 Feb 18 11:40:00 crc kubenswrapper[4922]: I0218 11:40:00.659584 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wz74v" Feb 18 11:40:00 crc kubenswrapper[4922]: I0218 11:40:00.698481 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wz74v" Feb 18 11:40:00 crc kubenswrapper[4922]: E0218 11:40:00.700013 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ccc67a_6393_4dea_9c00_24bbc55e34d3.slice/crio-ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd.scope\": RecentStats: unable to find data in memory cache]" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.400632 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.480250 4922 generic.go:334] "Generic (PLEG): container finished" podID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerID="ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd" exitCode=0 Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.480326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrxb6" event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerDied","Data":"ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd"} Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.480376 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrxb6" event={"ID":"48ccc67a-6393-4dea-9c00-24bbc55e34d3","Type":"ContainerDied","Data":"cf75383b2030608656ce7bf5f480ccca8d0511a948c0bdf822601c66ee3563e2"} Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.480401 4922 scope.go:117] "RemoveContainer" containerID="ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.480697 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nrxb6" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.502637 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-utilities\") pod \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.502685 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tq9z\" (UniqueName: \"kubernetes.io/projected/48ccc67a-6393-4dea-9c00-24bbc55e34d3-kube-api-access-9tq9z\") pod \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.503838 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-utilities" (OuterVolumeSpecName: "utilities") pod "48ccc67a-6393-4dea-9c00-24bbc55e34d3" (UID: "48ccc67a-6393-4dea-9c00-24bbc55e34d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.508731 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ccc67a-6393-4dea-9c00-24bbc55e34d3-kube-api-access-9tq9z" (OuterVolumeSpecName: "kube-api-access-9tq9z") pod "48ccc67a-6393-4dea-9c00-24bbc55e34d3" (UID: "48ccc67a-6393-4dea-9c00-24bbc55e34d3"). InnerVolumeSpecName "kube-api-access-9tq9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.567790 4922 scope.go:117] "RemoveContainer" containerID="f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.603420 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-catalog-content\") pod \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\" (UID: \"48ccc67a-6393-4dea-9c00-24bbc55e34d3\") " Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.603756 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.603774 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tq9z\" (UniqueName: \"kubernetes.io/projected/48ccc67a-6393-4dea-9c00-24bbc55e34d3-kube-api-access-9tq9z\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.655242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48ccc67a-6393-4dea-9c00-24bbc55e34d3" (UID: "48ccc67a-6393-4dea-9c00-24bbc55e34d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.708234 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ccc67a-6393-4dea-9c00-24bbc55e34d3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.710806 4922 scope.go:117] "RemoveContainer" containerID="26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.829839 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrxb6"] Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.832555 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nrxb6"] Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.915506 4922 scope.go:117] "RemoveContainer" containerID="ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd" Feb 18 11:40:01 crc kubenswrapper[4922]: E0218 11:40:01.915921 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd\": container with ID starting with ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd not found: ID does not exist" containerID="ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.915955 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd"} err="failed to get container status \"ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd\": rpc error: code = NotFound desc = could not find container \"ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd\": container with ID starting with 
ade67fc75daaf75a5b69f90fb93842c53cb85bb22999b13e9dd0f075f8b6c4bd not found: ID does not exist" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.915982 4922 scope.go:117] "RemoveContainer" containerID="f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1" Feb 18 11:40:01 crc kubenswrapper[4922]: E0218 11:40:01.921224 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1\": container with ID starting with f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1 not found: ID does not exist" containerID="f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.921263 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1"} err="failed to get container status \"f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1\": rpc error: code = NotFound desc = could not find container \"f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1\": container with ID starting with f96343f5641693f6e288bbd4accc54022881c3b1914c16bca3dd70533cbcb9f1 not found: ID does not exist" Feb 18 11:40:01 crc kubenswrapper[4922]: I0218 11:40:01.921305 4922 scope.go:117] "RemoveContainer" containerID="26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16" Feb 18 11:40:01 crc kubenswrapper[4922]: E0218 11:40:01.921705 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16\": container with ID starting with 26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16 not found: ID does not exist" containerID="26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16" Feb 18 11:40:01 crc 
kubenswrapper[4922]: I0218 11:40:01.921727 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16"} err="failed to get container status \"26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16\": rpc error: code = NotFound desc = could not find container \"26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16\": container with ID starting with 26033474e45a010781ba836c11c8f393e1aeb5595fb8120441fc214ed20d9a16 not found: ID does not exist" Feb 18 11:40:02 crc kubenswrapper[4922]: I0218 11:40:02.493999 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerStarted","Data":"bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd"} Feb 18 11:40:02 crc kubenswrapper[4922]: I0218 11:40:02.497220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wg8w6" event={"ID":"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54","Type":"ContainerStarted","Data":"b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e"} Feb 18 11:40:02 crc kubenswrapper[4922]: I0218 11:40:02.985604 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" path="/var/lib/kubelet/pods/48ccc67a-6393-4dea-9c00-24bbc55e34d3/volumes" Feb 18 11:40:03 crc kubenswrapper[4922]: I0218 11:40:03.505321 4922 generic.go:334] "Generic (PLEG): container finished" podID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerID="bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd" exitCode=0 Feb 18 11:40:03 crc kubenswrapper[4922]: I0218 11:40:03.505459 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" 
event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerDied","Data":"bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd"} Feb 18 11:40:03 crc kubenswrapper[4922]: I0218 11:40:03.508313 4922 generic.go:334] "Generic (PLEG): container finished" podID="d0a05956-6087-461d-a271-52db98c6032a" containerID="647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8" exitCode=0 Feb 18 11:40:03 crc kubenswrapper[4922]: I0218 11:40:03.508395 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf4nk" event={"ID":"d0a05956-6087-461d-a271-52db98c6032a","Type":"ContainerDied","Data":"647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8"} Feb 18 11:40:03 crc kubenswrapper[4922]: I0218 11:40:03.525939 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wg8w6" podStartSLOduration=4.3414163 podStartE2EDuration="53.525919109s" podCreationTimestamp="2026-02-18 11:39:10 +0000 UTC" firstStartedPulling="2026-02-18 11:39:12.049587493 +0000 UTC m=+153.777291573" lastFinishedPulling="2026-02-18 11:40:01.234090302 +0000 UTC m=+202.961794382" observedRunningTime="2026-02-18 11:40:02.514264968 +0000 UTC m=+204.241969048" watchObservedRunningTime="2026-02-18 11:40:03.525919109 +0000 UTC m=+205.253623189" Feb 18 11:40:07 crc kubenswrapper[4922]: I0218 11:40:07.205241 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-588bb8688f-kgnf7"] Feb 18 11:40:07 crc kubenswrapper[4922]: I0218 11:40:07.205801 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" podUID="381bae09-292e-47ab-a85b-eeec711acdd9" containerName="controller-manager" containerID="cri-o://888942b059f1b7ce4402d4d451bbe5ffef2dc01e980cc850af5d6d975e6db9a0" gracePeriod=30 Feb 18 11:40:07 crc kubenswrapper[4922]: I0218 11:40:07.258893 
4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"] Feb 18 11:40:07 crc kubenswrapper[4922]: I0218 11:40:07.259457 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" podUID="1ed4eefa-9d69-468e-b783-af3d0a1e7e75" containerName="route-controller-manager" containerID="cri-o://e286f860b267a30576d94c00bd4a0c76c12231f84c1094cab4113f22c54770c4" gracePeriod=30 Feb 18 11:40:07 crc kubenswrapper[4922]: I0218 11:40:07.521949 4922 patch_prober.go:28] interesting pod/controller-manager-588bb8688f-kgnf7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Feb 18 11:40:07 crc kubenswrapper[4922]: I0218 11:40:07.522015 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" podUID="381bae09-292e-47ab-a85b-eeec711acdd9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 18 11:40:08 crc kubenswrapper[4922]: I0218 11:40:08.546644 4922 generic.go:334] "Generic (PLEG): container finished" podID="381bae09-292e-47ab-a85b-eeec711acdd9" containerID="888942b059f1b7ce4402d4d451bbe5ffef2dc01e980cc850af5d6d975e6db9a0" exitCode=0 Feb 18 11:40:08 crc kubenswrapper[4922]: I0218 11:40:08.546935 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" event={"ID":"381bae09-292e-47ab-a85b-eeec711acdd9","Type":"ContainerDied","Data":"888942b059f1b7ce4402d4d451bbe5ffef2dc01e980cc850af5d6d975e6db9a0"} Feb 18 11:40:08 crc kubenswrapper[4922]: I0218 11:40:08.548724 4922 generic.go:334] "Generic 
(PLEG): container finished" podID="1ed4eefa-9d69-468e-b783-af3d0a1e7e75" containerID="e286f860b267a30576d94c00bd4a0c76c12231f84c1094cab4113f22c54770c4" exitCode=0 Feb 18 11:40:08 crc kubenswrapper[4922]: I0218 11:40:08.548813 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" event={"ID":"1ed4eefa-9d69-468e-b783-af3d0a1e7e75","Type":"ContainerDied","Data":"e286f860b267a30576d94c00bd4a0c76c12231f84c1094cab4113f22c54770c4"} Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.128573 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.167479 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"] Feb 18 11:40:09 crc kubenswrapper[4922]: E0218 11:40:09.167657 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="registry-server" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.167675 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="registry-server" Feb 18 11:40:09 crc kubenswrapper[4922]: E0218 11:40:09.167689 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="extract-content" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.167695 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="extract-content" Feb 18 11:40:09 crc kubenswrapper[4922]: E0218 11:40:09.167703 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="extract-utilities" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 
11:40:09.167709 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="extract-utilities" Feb 18 11:40:09 crc kubenswrapper[4922]: E0218 11:40:09.167722 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4eefa-9d69-468e-b783-af3d0a1e7e75" containerName="route-controller-manager" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.167728 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4eefa-9d69-468e-b783-af3d0a1e7e75" containerName="route-controller-manager" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.167812 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed4eefa-9d69-468e-b783-af3d0a1e7e75" containerName="route-controller-manager" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.167822 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ccc67a-6393-4dea-9c00-24bbc55e34d3" containerName="registry-server" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.168168 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.175481 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.194159 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"] Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310522 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqh2b\" (UniqueName: \"kubernetes.io/projected/381bae09-292e-47ab-a85b-eeec711acdd9-kube-api-access-dqh2b\") pod \"381bae09-292e-47ab-a85b-eeec711acdd9\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310573 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mhdl\" (UniqueName: \"kubernetes.io/projected/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-kube-api-access-7mhdl\") pod \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310602 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-config\") pod \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310640 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-proxy-ca-bundles\") pod \"381bae09-292e-47ab-a85b-eeec711acdd9\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-client-ca\") pod 
\"381bae09-292e-47ab-a85b-eeec711acdd9\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310680 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-client-ca\") pod \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310737 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-serving-cert\") pod \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\" (UID: \"1ed4eefa-9d69-468e-b783-af3d0a1e7e75\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310754 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/381bae09-292e-47ab-a85b-eeec711acdd9-serving-cert\") pod \"381bae09-292e-47ab-a85b-eeec711acdd9\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310773 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-config\") pod \"381bae09-292e-47ab-a85b-eeec711acdd9\" (UID: \"381bae09-292e-47ab-a85b-eeec711acdd9\") " Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310936 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-config\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.310974 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-serving-cert\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.311001 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdp7v\" (UniqueName: \"kubernetes.io/projected/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-kube-api-access-bdp7v\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.311049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-client-ca\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.316102 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381bae09-292e-47ab-a85b-eeec711acdd9-kube-api-access-dqh2b" (OuterVolumeSpecName: "kube-api-access-dqh2b") pod "381bae09-292e-47ab-a85b-eeec711acdd9" (UID: "381bae09-292e-47ab-a85b-eeec711acdd9"). InnerVolumeSpecName "kube-api-access-dqh2b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.316181 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-kube-api-access-7mhdl" (OuterVolumeSpecName: "kube-api-access-7mhdl") pod "1ed4eefa-9d69-468e-b783-af3d0a1e7e75" (UID: "1ed4eefa-9d69-468e-b783-af3d0a1e7e75"). InnerVolumeSpecName "kube-api-access-7mhdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.316584 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "381bae09-292e-47ab-a85b-eeec711acdd9" (UID: "381bae09-292e-47ab-a85b-eeec711acdd9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.316599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-client-ca" (OuterVolumeSpecName: "client-ca") pod "381bae09-292e-47ab-a85b-eeec711acdd9" (UID: "381bae09-292e-47ab-a85b-eeec711acdd9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.317063 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-client-ca" (OuterVolumeSpecName: "client-ca") pod "1ed4eefa-9d69-468e-b783-af3d0a1e7e75" (UID: "1ed4eefa-9d69-468e-b783-af3d0a1e7e75"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.317189 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-config" (OuterVolumeSpecName: "config") pod "1ed4eefa-9d69-468e-b783-af3d0a1e7e75" (UID: "1ed4eefa-9d69-468e-b783-af3d0a1e7e75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.317220 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-config" (OuterVolumeSpecName: "config") pod "381bae09-292e-47ab-a85b-eeec711acdd9" (UID: "381bae09-292e-47ab-a85b-eeec711acdd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.318773 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381bae09-292e-47ab-a85b-eeec711acdd9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "381bae09-292e-47ab-a85b-eeec711acdd9" (UID: "381bae09-292e-47ab-a85b-eeec711acdd9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.319249 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1ed4eefa-9d69-468e-b783-af3d0a1e7e75" (UID: "1ed4eefa-9d69-468e-b783-af3d0a1e7e75"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412439 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-client-ca\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412519 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-config\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412567 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-serving-cert\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdp7v\" (UniqueName: \"kubernetes.io/projected/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-kube-api-access-bdp7v\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412674 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412688 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412697 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412706 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/381bae09-292e-47ab-a85b-eeec711acdd9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412714 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412723 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381bae09-292e-47ab-a85b-eeec711acdd9-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412733 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqh2b\" (UniqueName: \"kubernetes.io/projected/381bae09-292e-47ab-a85b-eeec711acdd9-kube-api-access-dqh2b\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412744 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mhdl\" (UniqueName: \"kubernetes.io/projected/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-kube-api-access-7mhdl\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.412753 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ed4eefa-9d69-468e-b783-af3d0a1e7e75-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.413813 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-config\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.414185 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-client-ca\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.417948 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-serving-cert\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.433706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdp7v\" (UniqueName: \"kubernetes.io/projected/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-kube-api-access-bdp7v\") pod \"route-controller-manager-c5f5b6bc4-4xglc\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.493760 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.555242 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7" event={"ID":"381bae09-292e-47ab-a85b-eeec711acdd9","Type":"ContainerDied","Data":"24eefe81e8715469a6d268edf7354dca96b3d7cb46067661d42653e51f19e9c9"}
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.555323 4922 scope.go:117] "RemoveContainer" containerID="888942b059f1b7ce4402d4d451bbe5ffef2dc01e980cc850af5d6d975e6db9a0"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.555269 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-588bb8688f-kgnf7"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.557736 4922 generic.go:334] "Generic (PLEG): container finished" podID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerID="e63e35cde0feed6b8593caa8621cdbf494c204e6b29cfbb54918e219b2da13b4" exitCode=0
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.557923 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcnjk" event={"ID":"fc9b41f8-ac9b-4166-a2a6-80326e19254a","Type":"ContainerDied","Data":"e63e35cde0feed6b8593caa8621cdbf494c204e6b29cfbb54918e219b2da13b4"}
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.565169 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerStarted","Data":"7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178"}
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.573397 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf4nk" event={"ID":"d0a05956-6087-461d-a271-52db98c6032a","Type":"ContainerStarted","Data":"f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b"}
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.577634 4922 generic.go:334] "Generic (PLEG): container finished" podID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerID="0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181" exitCode=0
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.577709 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5lflw" event={"ID":"9cddee0a-8b13-429b-89b6-e820f8f3ec59","Type":"ContainerDied","Data":"0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181"}
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.583582 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz" event={"ID":"1ed4eefa-9d69-468e-b783-af3d0a1e7e75","Type":"ContainerDied","Data":"689bf74da1dea7687a9978545b709d3db28e7791ae16a26226addae8a1619d61"}
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.583642 4922 scope.go:117] "RemoveContainer" containerID="e286f860b267a30576d94c00bd4a0c76c12231f84c1094cab4113f22c54770c4"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.583790 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.626167 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5vjsn" podStartSLOduration=2.845171413 podStartE2EDuration="1m1.626147721s" podCreationTimestamp="2026-02-18 11:39:08 +0000 UTC" firstStartedPulling="2026-02-18 11:39:09.998674541 +0000 UTC m=+151.726378621" lastFinishedPulling="2026-02-18 11:40:08.779650859 +0000 UTC m=+210.507354929" observedRunningTime="2026-02-18 11:40:09.622722919 +0000 UTC m=+211.350426999" watchObservedRunningTime="2026-02-18 11:40:09.626147721 +0000 UTC m=+211.353851811"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.648899 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-588bb8688f-kgnf7"]
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.663227 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-588bb8688f-kgnf7"]
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.663327 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nf4nk"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.663392 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nf4nk"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.672352 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nf4nk" podStartSLOduration=2.926752521 podStartE2EDuration="1m0.672329261s" podCreationTimestamp="2026-02-18 11:39:09 +0000 UTC" firstStartedPulling="2026-02-18 11:39:11.031483733 +0000 UTC m=+152.759187813" lastFinishedPulling="2026-02-18 11:40:08.777060473 +0000 UTC m=+210.504764553" observedRunningTime="2026-02-18 11:40:09.655291725 +0000 UTC m=+211.382995795" watchObservedRunningTime="2026-02-18 11:40:09.672329261 +0000 UTC m=+211.400033351"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.677489 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"]
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.682487 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bbd976d-c94kz"]
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.761601 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"]
Feb 18 11:40:09 crc kubenswrapper[4922]: W0218 11:40:09.767688 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33610a6b_c93e_4578_b8d9_93f5a0dbe1a4.slice/crio-45dfbf68b12565a4b195c299a902d800748f462bcdfa1500e5b1e3f3b967157e WatchSource:0}: Error finding container 45dfbf68b12565a4b195c299a902d800748f462bcdfa1500e5b1e3f3b967157e: Status 404 returned error can't find the container with id 45dfbf68b12565a4b195c299a902d800748f462bcdfa1500e5b1e3f3b967157e
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.807935 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.808006 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.808060 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.808773 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 11:40:09 crc kubenswrapper[4922]: I0218 11:40:09.808830 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b" gracePeriod=600
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.600795 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" event={"ID":"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4","Type":"ContainerStarted","Data":"bebc46c7f9271b765da6413fad509e6ec5685b8951728d70348b293df49ce847"}
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.600839 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" event={"ID":"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4","Type":"ContainerStarted","Data":"45dfbf68b12565a4b195c299a902d800748f462bcdfa1500e5b1e3f3b967157e"}
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.726995 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nf4nk" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="registry-server" probeResult="failure" output=<
Feb 18 11:40:10 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Feb 18 11:40:10 crc kubenswrapper[4922]: >
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.981526 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed4eefa-9d69-468e-b783-af3d0a1e7e75" path="/var/lib/kubelet/pods/1ed4eefa-9d69-468e-b783-af3d0a1e7e75/volumes"
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.982770 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381bae09-292e-47ab-a85b-eeec711acdd9" path="/var/lib/kubelet/pods/381bae09-292e-47ab-a85b-eeec711acdd9/volumes"
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.994791 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:40:10 crc kubenswrapper[4922]: I0218 11:40:10.994999 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.039574 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.611804 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b" exitCode=0
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.611932 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b"}
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.612213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"f1a0cd1a4059d80457c2ea470bb511d4a674a1cbfe4b38356d0636a94629e11f"}
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.619174 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcnjk" event={"ID":"fc9b41f8-ac9b-4166-a2a6-80326e19254a","Type":"ContainerStarted","Data":"99eb33433ccb75dafa75d7dcda595a86756a6a3d18a2403102396dc29b8e05f3"}
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.623676 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.627827 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.656607 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" podStartSLOduration=4.656579463 podStartE2EDuration="4.656579463s" podCreationTimestamp="2026-02-18 11:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:40:11.654567153 +0000 UTC m=+213.382271243" watchObservedRunningTime="2026-02-18 11:40:11.656579463 +0000 UTC m=+213.384283543"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.686207 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lcnjk" podStartSLOduration=2.206283972 podStartE2EDuration="1m4.686189241s" podCreationTimestamp="2026-02-18 11:39:07 +0000 UTC" firstStartedPulling="2026-02-18 11:39:08.91292146 +0000 UTC m=+150.640625540" lastFinishedPulling="2026-02-18 11:40:11.392826729 +0000 UTC m=+213.120530809" observedRunningTime="2026-02-18 11:40:11.684915364 +0000 UTC m=+213.412619454" watchObservedRunningTime="2026-02-18 11:40:11.686189241 +0000 UTC m=+213.413893321"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.689629 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.971437 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cc84f956-ltmd7"]
Feb 18 11:40:11 crc kubenswrapper[4922]: E0218 11:40:11.972061 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381bae09-292e-47ab-a85b-eeec711acdd9" containerName="controller-manager"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.972081 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="381bae09-292e-47ab-a85b-eeec711acdd9" containerName="controller-manager"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.972207 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="381bae09-292e-47ab-a85b-eeec711acdd9" containerName="controller-manager"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.972943 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.975065 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.975112 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.975628 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.975775 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.977828 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.978322 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.989197 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 18 11:40:11 crc kubenswrapper[4922]: I0218 11:40:11.995999 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cc84f956-ltmd7"]
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.054302 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72ad78b5-bc12-488f-aab9-869895d67ce8-serving-cert\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.054369 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-proxy-ca-bundles\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.054441 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-config\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.054482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjcrd\" (UniqueName: \"kubernetes.io/projected/72ad78b5-bc12-488f-aab9-869895d67ce8-kube-api-access-kjcrd\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.054533 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-client-ca\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.155524 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72ad78b5-bc12-488f-aab9-869895d67ce8-serving-cert\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.155856 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-proxy-ca-bundles\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.155969 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-config\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.156104 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjcrd\" (UniqueName: \"kubernetes.io/projected/72ad78b5-bc12-488f-aab9-869895d67ce8-kube-api-access-kjcrd\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.157085 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-proxy-ca-bundles\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.157201 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-config\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.157294 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-client-ca\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.157845 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-client-ca\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.162816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72ad78b5-bc12-488f-aab9-869895d67ce8-serving-cert\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.179577 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjcrd\" (UniqueName: \"kubernetes.io/projected/72ad78b5-bc12-488f-aab9-869895d67ce8-kube-api-access-kjcrd\") pod \"controller-manager-5cc84f956-ltmd7\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.288344 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.500944 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cc84f956-ltmd7"]
Feb 18 11:40:12 crc kubenswrapper[4922]: W0218 11:40:12.506208 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ad78b5_bc12_488f_aab9_869895d67ce8.slice/crio-9ec1524a75d8bd2f1625be98e138e6cf305c4ac3d285301d8e45c5b5ccd89829 WatchSource:0}: Error finding container 9ec1524a75d8bd2f1625be98e138e6cf305c4ac3d285301d8e45c5b5ccd89829: Status 404 returned error can't find the container with id 9ec1524a75d8bd2f1625be98e138e6cf305c4ac3d285301d8e45c5b5ccd89829
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.631524 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5lflw" event={"ID":"9cddee0a-8b13-429b-89b6-e820f8f3ec59","Type":"ContainerStarted","Data":"3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b"}
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.633061 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" event={"ID":"72ad78b5-bc12-488f-aab9-869895d67ce8","Type":"ContainerStarted","Data":"9ec1524a75d8bd2f1625be98e138e6cf305c4ac3d285301d8e45c5b5ccd89829"}
Feb 18 11:40:12 crc kubenswrapper[4922]: I0218 11:40:12.655433 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5lflw" podStartSLOduration=4.137952085 podStartE2EDuration="1m6.655408254s" podCreationTimestamp="2026-02-18 11:39:06 +0000 UTC" firstStartedPulling="2026-02-18 11:39:08.942073978 +0000 UTC m=+150.669778058" lastFinishedPulling="2026-02-18 11:40:11.459530137 +0000 UTC m=+213.187234227" observedRunningTime="2026-02-18 11:40:12.651866838 +0000 UTC m=+214.379570918" watchObservedRunningTime="2026-02-18 11:40:12.655408254 +0000 UTC m=+214.383112364"
Feb 18 11:40:13 crc kubenswrapper[4922]: I0218 11:40:13.600443 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wg8w6"]
Feb 18 11:40:13 crc kubenswrapper[4922]: I0218 11:40:13.640471 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" event={"ID":"72ad78b5-bc12-488f-aab9-869895d67ce8","Type":"ContainerStarted","Data":"c09c49cb94f646993c2a4b0fda6fbdd7f540489b4fa899ee2bacffa6390fbbe3"}
Feb 18 11:40:13 crc kubenswrapper[4922]: I0218 11:40:13.663830 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" podStartSLOduration=6.6638067979999995 podStartE2EDuration="6.663806798s" podCreationTimestamp="2026-02-18 11:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:40:13.661024195 +0000 UTC m=+215.388728295" watchObservedRunningTime="2026-02-18 11:40:13.663806798 +0000 UTC m=+215.391510888"
Feb 18 11:40:14 crc kubenswrapper[4922]: I0218 11:40:14.646972 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:14 crc kubenswrapper[4922]: I0218 11:40:14.646988 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wg8w6" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="registry-server" containerID="cri-o://b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e" gracePeriod=2
Feb 18 11:40:14 crc kubenswrapper[4922]: I0218 11:40:14.653757 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7"
Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.376979 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wg8w6"
Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.429115 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-catalog-content\") pod \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") "
Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.429176 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c89j\" (UniqueName: \"kubernetes.io/projected/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-kube-api-access-8c89j\") pod \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") "
Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.429257 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-utilities\") pod \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\" (UID: \"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54\") "
Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.430102 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-utilities" (OuterVolumeSpecName: "utilities") pod "80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" (UID: "80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.435072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-kube-api-access-8c89j" (OuterVolumeSpecName: "kube-api-access-8c89j") pod "80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" (UID: "80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54"). InnerVolumeSpecName "kube-api-access-8c89j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.531385 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.531441 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c89j\" (UniqueName: \"kubernetes.io/projected/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-kube-api-access-8c89j\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.553830 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" (UID: "80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.632706 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.659637 4922 generic.go:334] "Generic (PLEG): container finished" podID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerID="b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e" exitCode=0 Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.659709 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wg8w6" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.659733 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wg8w6" event={"ID":"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54","Type":"ContainerDied","Data":"b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e"} Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.659777 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wg8w6" event={"ID":"80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54","Type":"ContainerDied","Data":"25d2b5e0e4b5a6b68be1bcb304a04f4b6084a6c0bdb0404fc5532afe958fbbd6"} Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.659796 4922 scope.go:117] "RemoveContainer" containerID="b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.677034 4922 scope.go:117] "RemoveContainer" containerID="88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.695692 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wg8w6"] Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 
11:40:16.698904 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wg8w6"] Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.707650 4922 scope.go:117] "RemoveContainer" containerID="8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.729177 4922 scope.go:117] "RemoveContainer" containerID="b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e" Feb 18 11:40:16 crc kubenswrapper[4922]: E0218 11:40:16.730243 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e\": container with ID starting with b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e not found: ID does not exist" containerID="b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.730308 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e"} err="failed to get container status \"b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e\": rpc error: code = NotFound desc = could not find container \"b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e\": container with ID starting with b5153992a200b7e1d462bdf29a1a95548cb5c21776d32e5caa2c14598b127e2e not found: ID does not exist" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.730342 4922 scope.go:117] "RemoveContainer" containerID="88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b" Feb 18 11:40:16 crc kubenswrapper[4922]: E0218 11:40:16.730694 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b\": container with ID 
starting with 88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b not found: ID does not exist" containerID="88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.731032 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b"} err="failed to get container status \"88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b\": rpc error: code = NotFound desc = could not find container \"88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b\": container with ID starting with 88bed4d626019726ebb91e748414ebf1fc49f67cf72a95e0e09268ee83ee0d6b not found: ID does not exist" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.731071 4922 scope.go:117] "RemoveContainer" containerID="8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777" Feb 18 11:40:16 crc kubenswrapper[4922]: E0218 11:40:16.731354 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777\": container with ID starting with 8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777 not found: ID does not exist" containerID="8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.731402 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777"} err="failed to get container status \"8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777\": rpc error: code = NotFound desc = could not find container \"8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777\": container with ID starting with 8dd3aff6ccf90bcc5b2f3c4b1fa26fee9db4eb20a75dd95c5b684cb6dd48e777 not found: 
ID does not exist" Feb 18 11:40:16 crc kubenswrapper[4922]: I0218 11:40:16.982600 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" path="/var/lib/kubelet/pods/80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54/volumes" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.252886 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.253165 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.309979 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.629119 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.629525 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.675022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.716072 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:40:17 crc kubenswrapper[4922]: I0218 11:40:17.726601 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.007868 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lcnjk"] Feb 18 11:40:19 crc 
kubenswrapper[4922]: I0218 11:40:19.199478 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.199664 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.255563 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.685736 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lcnjk" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="registry-server" containerID="cri-o://99eb33433ccb75dafa75d7dcda595a86756a6a3d18a2403102396dc29b8e05f3" gracePeriod=2 Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.706888 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q7mwg"] Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.738147 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.772721 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:40:19 crc kubenswrapper[4922]: I0218 11:40:19.817022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:40:20 crc kubenswrapper[4922]: I0218 11:40:20.695127 4922 generic.go:334] "Generic (PLEG): container finished" podID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerID="99eb33433ccb75dafa75d7dcda595a86756a6a3d18a2403102396dc29b8e05f3" exitCode=0 Feb 18 11:40:20 crc kubenswrapper[4922]: I0218 11:40:20.695234 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcnjk" event={"ID":"fc9b41f8-ac9b-4166-a2a6-80326e19254a","Type":"ContainerDied","Data":"99eb33433ccb75dafa75d7dcda595a86756a6a3d18a2403102396dc29b8e05f3"} Feb 18 11:40:20 crc kubenswrapper[4922]: I0218 11:40:20.931030 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.006204 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-utilities\") pod \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.006243 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-catalog-content\") pod \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.006271 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvvjz\" (UniqueName: \"kubernetes.io/projected/fc9b41f8-ac9b-4166-a2a6-80326e19254a-kube-api-access-fvvjz\") pod \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\" (UID: \"fc9b41f8-ac9b-4166-a2a6-80326e19254a\") " Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.007258 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-utilities" (OuterVolumeSpecName: "utilities") pod "fc9b41f8-ac9b-4166-a2a6-80326e19254a" (UID: "fc9b41f8-ac9b-4166-a2a6-80326e19254a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.015739 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9b41f8-ac9b-4166-a2a6-80326e19254a-kube-api-access-fvvjz" (OuterVolumeSpecName: "kube-api-access-fvvjz") pod "fc9b41f8-ac9b-4166-a2a6-80326e19254a" (UID: "fc9b41f8-ac9b-4166-a2a6-80326e19254a"). InnerVolumeSpecName "kube-api-access-fvvjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.060080 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc9b41f8-ac9b-4166-a2a6-80326e19254a" (UID: "fc9b41f8-ac9b-4166-a2a6-80326e19254a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.107779 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvvjz\" (UniqueName: \"kubernetes.io/projected/fc9b41f8-ac9b-4166-a2a6-80326e19254a-kube-api-access-fvvjz\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.107825 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.107839 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9b41f8-ac9b-4166-a2a6-80326e19254a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.602122 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf4nk"] Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 
11:40:21.705290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcnjk" event={"ID":"fc9b41f8-ac9b-4166-a2a6-80326e19254a","Type":"ContainerDied","Data":"a8cd6fc2cc1529740a5644abc0364ded3c6ac51bb6647c92d79339c69415a299"} Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.705339 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcnjk" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.705428 4922 scope.go:117] "RemoveContainer" containerID="99eb33433ccb75dafa75d7dcda595a86756a6a3d18a2403102396dc29b8e05f3" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.705446 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nf4nk" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="registry-server" containerID="cri-o://f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b" gracePeriod=2 Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.728823 4922 scope.go:117] "RemoveContainer" containerID="e63e35cde0feed6b8593caa8621cdbf494c204e6b29cfbb54918e219b2da13b4" Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.737257 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lcnjk"] Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.740469 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lcnjk"] Feb 18 11:40:21 crc kubenswrapper[4922]: I0218 11:40:21.764796 4922 scope.go:117] "RemoveContainer" containerID="120adbdad8789c27eefe3c782cdf2eec2b4857607b20057ec0fcf6bbe6831fd0" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.228864 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.422825 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdsv\" (UniqueName: \"kubernetes.io/projected/d0a05956-6087-461d-a271-52db98c6032a-kube-api-access-tzdsv\") pod \"d0a05956-6087-461d-a271-52db98c6032a\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.422988 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-utilities\") pod \"d0a05956-6087-461d-a271-52db98c6032a\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.423057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-catalog-content\") pod \"d0a05956-6087-461d-a271-52db98c6032a\" (UID: \"d0a05956-6087-461d-a271-52db98c6032a\") " Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.425292 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-utilities" (OuterVolumeSpecName: "utilities") pod "d0a05956-6087-461d-a271-52db98c6032a" (UID: "d0a05956-6087-461d-a271-52db98c6032a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.433638 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a05956-6087-461d-a271-52db98c6032a-kube-api-access-tzdsv" (OuterVolumeSpecName: "kube-api-access-tzdsv") pod "d0a05956-6087-461d-a271-52db98c6032a" (UID: "d0a05956-6087-461d-a271-52db98c6032a"). InnerVolumeSpecName "kube-api-access-tzdsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.461929 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0a05956-6087-461d-a271-52db98c6032a" (UID: "d0a05956-6087-461d-a271-52db98c6032a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.524351 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzdsv\" (UniqueName: \"kubernetes.io/projected/d0a05956-6087-461d-a271-52db98c6032a-kube-api-access-tzdsv\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.524399 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.524411 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a05956-6087-461d-a271-52db98c6032a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.719423 4922 generic.go:334] "Generic (PLEG): container finished" podID="d0a05956-6087-461d-a271-52db98c6032a" containerID="f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b" exitCode=0 Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.719508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nf4nk" event={"ID":"d0a05956-6087-461d-a271-52db98c6032a","Type":"ContainerDied","Data":"f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b"} Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.719649 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-nf4nk" event={"ID":"d0a05956-6087-461d-a271-52db98c6032a","Type":"ContainerDied","Data":"a1f5dac4997bf52c337b4ff07c819ce5fc50051e9cba18a9a9a797070bbd762b"} Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.719683 4922 scope.go:117] "RemoveContainer" containerID="f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.719862 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nf4nk" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.740725 4922 scope.go:117] "RemoveContainer" containerID="647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.774242 4922 scope.go:117] "RemoveContainer" containerID="e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.777843 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf4nk"] Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.784678 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nf4nk"] Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.802528 4922 scope.go:117] "RemoveContainer" containerID="f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b" Feb 18 11:40:22 crc kubenswrapper[4922]: E0218 11:40:22.803225 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b\": container with ID starting with f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b not found: ID does not exist" containerID="f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.803294 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b"} err="failed to get container status \"f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b\": rpc error: code = NotFound desc = could not find container \"f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b\": container with ID starting with f8ca4a530f7e5d39796311f5abf7b45f579aa1e9d87cb174faf9dad00bb56f6b not found: ID does not exist" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.803343 4922 scope.go:117] "RemoveContainer" containerID="647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8" Feb 18 11:40:22 crc kubenswrapper[4922]: E0218 11:40:22.803905 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8\": container with ID starting with 647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8 not found: ID does not exist" containerID="647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.803959 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8"} err="failed to get container status \"647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8\": rpc error: code = NotFound desc = could not find container \"647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8\": container with ID starting with 647b9976eecd46fff2a94f43bff0b109d69e22d4342128856c40ac682b1d18e8 not found: ID does not exist" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.803996 4922 scope.go:117] "RemoveContainer" containerID="e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0" Feb 18 11:40:22 crc kubenswrapper[4922]: E0218 
11:40:22.804577 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0\": container with ID starting with e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0 not found: ID does not exist" containerID="e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.804620 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0"} err="failed to get container status \"e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0\": rpc error: code = NotFound desc = could not find container \"e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0\": container with ID starting with e5b592a27d7235e3492be66b98f4503fc9cce34317fcbba82078ccf86ac362b0 not found: ID does not exist" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.980204 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a05956-6087-461d-a271-52db98c6032a" path="/var/lib/kubelet/pods/d0a05956-6087-461d-a271-52db98c6032a/volumes" Feb 18 11:40:22 crc kubenswrapper[4922]: I0218 11:40:22.981014 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" path="/var/lib/kubelet/pods/fc9b41f8-ac9b-4166-a2a6-80326e19254a/volumes" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.203221 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cc84f956-ltmd7"] Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.204628 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" podUID="72ad78b5-bc12-488f-aab9-869895d67ce8" containerName="controller-manager" 
containerID="cri-o://c09c49cb94f646993c2a4b0fda6fbdd7f540489b4fa899ee2bacffa6390fbbe3" gracePeriod=30 Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.302827 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"] Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.303429 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" podUID="33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" containerName="route-controller-manager" containerID="cri-o://bebc46c7f9271b765da6413fad509e6ec5685b8951728d70348b293df49ce847" gracePeriod=30 Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.745453 4922 generic.go:334] "Generic (PLEG): container finished" podID="33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" containerID="bebc46c7f9271b765da6413fad509e6ec5685b8951728d70348b293df49ce847" exitCode=0 Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.745540 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" event={"ID":"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4","Type":"ContainerDied","Data":"bebc46c7f9271b765da6413fad509e6ec5685b8951728d70348b293df49ce847"} Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.745593 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" event={"ID":"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4","Type":"ContainerDied","Data":"45dfbf68b12565a4b195c299a902d800748f462bcdfa1500e5b1e3f3b967157e"} Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.745617 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45dfbf68b12565a4b195c299a902d800748f462bcdfa1500e5b1e3f3b967157e" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.746775 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="72ad78b5-bc12-488f-aab9-869895d67ce8" containerID="c09c49cb94f646993c2a4b0fda6fbdd7f540489b4fa899ee2bacffa6390fbbe3" exitCode=0 Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.746814 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" event={"ID":"72ad78b5-bc12-488f-aab9-869895d67ce8","Type":"ContainerDied","Data":"c09c49cb94f646993c2a4b0fda6fbdd7f540489b4fa899ee2bacffa6390fbbe3"} Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.779322 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.814733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-client-ca\") pod \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.814787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-serving-cert\") pod \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.814857 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdp7v\" (UniqueName: \"kubernetes.io/projected/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-kube-api-access-bdp7v\") pod \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.814892 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-config\") pod \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\" (UID: \"33610a6b-c93e-4578-b8d9-93f5a0dbe1a4\") " Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.815761 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-config" (OuterVolumeSpecName: "config") pod "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" (UID: "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.816109 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-client-ca" (OuterVolumeSpecName: "client-ca") pod "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" (UID: "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.820807 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-kube-api-access-bdp7v" (OuterVolumeSpecName: "kube-api-access-bdp7v") pod "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" (UID: "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4"). InnerVolumeSpecName "kube-api-access-bdp7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.825702 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" (UID: "33610a6b-c93e-4578-b8d9-93f5a0dbe1a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.916192 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdp7v\" (UniqueName: \"kubernetes.io/projected/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-kube-api-access-bdp7v\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.916701 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.916784 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:27 crc kubenswrapper[4922]: I0218 11:40:27.916846 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.276647 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.320703 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-config\") pod \"72ad78b5-bc12-488f-aab9-869895d67ce8\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.320747 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-proxy-ca-bundles\") pod \"72ad78b5-bc12-488f-aab9-869895d67ce8\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.320765 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-client-ca\") pod \"72ad78b5-bc12-488f-aab9-869895d67ce8\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.320786 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjcrd\" (UniqueName: \"kubernetes.io/projected/72ad78b5-bc12-488f-aab9-869895d67ce8-kube-api-access-kjcrd\") pod \"72ad78b5-bc12-488f-aab9-869895d67ce8\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.321772 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-client-ca" (OuterVolumeSpecName: "client-ca") pod "72ad78b5-bc12-488f-aab9-869895d67ce8" (UID: "72ad78b5-bc12-488f-aab9-869895d67ce8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.321823 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "72ad78b5-bc12-488f-aab9-869895d67ce8" (UID: "72ad78b5-bc12-488f-aab9-869895d67ce8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.321845 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-config" (OuterVolumeSpecName: "config") pod "72ad78b5-bc12-488f-aab9-869895d67ce8" (UID: "72ad78b5-bc12-488f-aab9-869895d67ce8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.322332 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72ad78b5-bc12-488f-aab9-869895d67ce8-serving-cert\") pod \"72ad78b5-bc12-488f-aab9-869895d67ce8\" (UID: \"72ad78b5-bc12-488f-aab9-869895d67ce8\") " Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.322917 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.322944 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.322959 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/72ad78b5-bc12-488f-aab9-869895d67ce8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.325663 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ad78b5-bc12-488f-aab9-869895d67ce8-kube-api-access-kjcrd" (OuterVolumeSpecName: "kube-api-access-kjcrd") pod "72ad78b5-bc12-488f-aab9-869895d67ce8" (UID: "72ad78b5-bc12-488f-aab9-869895d67ce8"). InnerVolumeSpecName "kube-api-access-kjcrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.325775 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ad78b5-bc12-488f-aab9-869895d67ce8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "72ad78b5-bc12-488f-aab9-869895d67ce8" (UID: "72ad78b5-bc12-488f-aab9-869895d67ce8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.424492 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72ad78b5-bc12-488f-aab9-869895d67ce8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.424537 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjcrd\" (UniqueName: \"kubernetes.io/projected/72ad78b5-bc12-488f-aab9-869895d67ce8-kube-api-access-kjcrd\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.752623 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.752661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" event={"ID":"72ad78b5-bc12-488f-aab9-869895d67ce8","Type":"ContainerDied","Data":"9ec1524a75d8bd2f1625be98e138e6cf305c4ac3d285301d8e45c5b5ccd89829"} Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.752630 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cc84f956-ltmd7" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.752766 4922 scope.go:117] "RemoveContainer" containerID="c09c49cb94f646993c2a4b0fda6fbdd7f540489b4fa899ee2bacffa6390fbbe3" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.797577 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cc84f956-ltmd7"] Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.802925 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cc84f956-ltmd7"] Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.811091 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"] Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.814658 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c5f5b6bc4-4xglc"] Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.980151 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" path="/var/lib/kubelet/pods/33610a6b-c93e-4578-b8d9-93f5a0dbe1a4/volumes" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.980805 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="72ad78b5-bc12-488f-aab9-869895d67ce8" path="/var/lib/kubelet/pods/72ad78b5-bc12-488f-aab9-869895d67ce8/volumes" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981245 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w"] Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981467 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="extract-content" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981488 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="extract-content" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981504 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="extract-utilities" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981512 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="extract-utilities" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981522 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981529 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981541 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" containerName="route-controller-manager" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981549 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" containerName="route-controller-manager" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981559 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981566 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981587 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="extract-utilities" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981596 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="extract-utilities" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981605 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ad78b5-bc12-488f-aab9-869895d67ce8" containerName="controller-manager" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981613 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ad78b5-bc12-488f-aab9-869895d67ce8" containerName="controller-manager" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981625 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="extract-content" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981631 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="extract-content" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981640 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981648 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981660 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="extract-utilities" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981668 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="extract-utilities" Feb 18 11:40:28 crc kubenswrapper[4922]: E0218 11:40:28.981679 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="extract-content" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981687 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="extract-content" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981816 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ad78b5-bc12-488f-aab9-869895d67ce8" containerName="controller-manager" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981830 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a05956-6087-461d-a271-52db98c6032a" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981844 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="33610a6b-c93e-4578-b8d9-93f5a0dbe1a4" containerName="route-controller-manager" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981855 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="80efc1b6-0ae2-4cbf-8dc9-0e2c4d526f54" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.981868 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9b41f8-ac9b-4166-a2a6-80326e19254a" containerName="registry-server" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.982714 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.983583 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65bf898576-dcl7m"] Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.984408 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.990936 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.990977 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.991198 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.996912 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.997209 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.997478 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.997506 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.997667 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.997797 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.997920 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.998805 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 11:40:28 crc kubenswrapper[4922]: I0218 11:40:28.998982 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.001104 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w"] Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.004714 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.009499 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65bf898576-dcl7m"] Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031034 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wg8v\" (UniqueName: \"kubernetes.io/projected/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-kube-api-access-5wg8v\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031076 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-serving-cert\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031136 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-config\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031170 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d32a70-8611-452d-8f4a-04e84753d49d-serving-cert\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031190 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-client-ca\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031206 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84d32a70-8611-452d-8f4a-04e84753d49d-client-ca\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: 
\"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031220 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvld\" (UniqueName: \"kubernetes.io/projected/84d32a70-8611-452d-8f4a-04e84753d49d-kube-api-access-frvld\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031237 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d32a70-8611-452d-8f4a-04e84753d49d-config\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.031258 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-proxy-ca-bundles\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.131916 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d32a70-8611-452d-8f4a-04e84753d49d-serving-cert\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132202 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-client-ca\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132292 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84d32a70-8611-452d-8f4a-04e84753d49d-client-ca\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132406 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frvld\" (UniqueName: \"kubernetes.io/projected/84d32a70-8611-452d-8f4a-04e84753d49d-kube-api-access-frvld\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d32a70-8611-452d-8f4a-04e84753d49d-config\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132585 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-proxy-ca-bundles\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") 
" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132675 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wg8v\" (UniqueName: \"kubernetes.io/projected/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-kube-api-access-5wg8v\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132763 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-serving-cert\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.132858 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-config\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.133542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84d32a70-8611-452d-8f4a-04e84753d49d-client-ca\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.133778 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-client-ca\") pod 
\"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.134092 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-proxy-ca-bundles\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.134531 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-config\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.134638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d32a70-8611-452d-8f4a-04e84753d49d-config\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.136355 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d32a70-8611-452d-8f4a-04e84753d49d-serving-cert\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.136379 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-serving-cert\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.155244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvld\" (UniqueName: \"kubernetes.io/projected/84d32a70-8611-452d-8f4a-04e84753d49d-kube-api-access-frvld\") pod \"route-controller-manager-cc78685b5-mmg7w\" (UID: \"84d32a70-8611-452d-8f4a-04e84753d49d\") " pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.159935 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wg8v\" (UniqueName: \"kubernetes.io/projected/0e5ba5b5-0f09-46fe-bcd1-52d3283562e3-kube-api-access-5wg8v\") pod \"controller-manager-65bf898576-dcl7m\" (UID: \"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3\") " pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.296949 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.304207 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.802900 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w"] Feb 18 11:40:29 crc kubenswrapper[4922]: W0218 11:40:29.805647 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d32a70_8611_452d_8f4a_04e84753d49d.slice/crio-2b95fa6d35dcdb869da77156680a0c535a0d0e1a92ce39842346105f70fc3ff8 WatchSource:0}: Error finding container 2b95fa6d35dcdb869da77156680a0c535a0d0e1a92ce39842346105f70fc3ff8: Status 404 returned error can't find the container with id 2b95fa6d35dcdb869da77156680a0c535a0d0e1a92ce39842346105f70fc3ff8 Feb 18 11:40:29 crc kubenswrapper[4922]: I0218 11:40:29.862229 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65bf898576-dcl7m"] Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.768269 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" event={"ID":"84d32a70-8611-452d-8f4a-04e84753d49d","Type":"ContainerStarted","Data":"e2a1d3e949a6e433768f3d71473b4d25c266f367076510b6399c9ab451840655"} Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.768339 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" event={"ID":"84d32a70-8611-452d-8f4a-04e84753d49d","Type":"ContainerStarted","Data":"2b95fa6d35dcdb869da77156680a0c535a0d0e1a92ce39842346105f70fc3ff8"} Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.768758 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 
11:40:30.769553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" event={"ID":"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3","Type":"ContainerStarted","Data":"58935e6a8e4f4b43e0cc61ef368e0aeafc91a6f1c962ce9208aa73de4023391b"} Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.769592 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" event={"ID":"0e5ba5b5-0f09-46fe-bcd1-52d3283562e3","Type":"ContainerStarted","Data":"32127c44cac5dc4ac1970ac1b7396179a79cdd3a32cb30e2c9d6219e276ce192"} Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.769858 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.774138 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.774309 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.790625 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cc78685b5-mmg7w" podStartSLOduration=3.790609223 podStartE2EDuration="3.790609223s" podCreationTimestamp="2026-02-18 11:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:40:30.787910753 +0000 UTC m=+232.515614843" watchObservedRunningTime="2026-02-18 11:40:30.790609223 +0000 UTC m=+232.518313303" Feb 18 11:40:30 crc kubenswrapper[4922]: I0218 11:40:30.807573 4922 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-65bf898576-dcl7m" podStartSLOduration=3.807553555 podStartE2EDuration="3.807553555s" podCreationTimestamp="2026-02-18 11:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:40:30.806304578 +0000 UTC m=+232.534008658" watchObservedRunningTime="2026-02-18 11:40:30.807553555 +0000 UTC m=+232.535257635" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.880602 4922 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.881621 4922 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.882557 4922 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.882705 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.883671 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c" gracePeriod=15 Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.883737 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6" gracePeriod=15 Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.883690 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a" gracePeriod=15 Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.883704 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8" gracePeriod=15 Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.884074 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe" gracePeriod=15 Feb 18 11:40:32 crc 
kubenswrapper[4922]: I0218 11:40:32.887751 4922 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888323 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888357 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888406 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888425 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888446 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888650 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888694 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888707 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888747 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="setup" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888760 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888784 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888796 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.888815 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.888827 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889181 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889214 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889243 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889260 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889284 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889298 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889317 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 11:40:32 crc kubenswrapper[4922]: E0218 11:40:32.889709 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.889730 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:40:32 crc kubenswrapper[4922]: I0218 11:40:32.933704 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.002435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.002482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.002534 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.002569 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.002594 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.003346 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.003404 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.003441 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.104963 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105020 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105044 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 
11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105098 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105125 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105234 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105231 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105099 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105411 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105437 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105532 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105580 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105614 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105645 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.105725 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.228706 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:40:33 crc kubenswrapper[4922]: E0218 11:40:33.256926 4922 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18955467ce10eb1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 11:40:33.25573814 +0000 UTC m=+234.983442240,LastTimestamp:2026-02-18 11:40:33.25573814 +0000 UTC m=+234.983442240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.791248 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125"} Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.791298 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b7186cbec24efd9deedddf85df28365fee1afa7fa17382bc569bd5b4abc33045"} Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.792387 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.795600 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.798409 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.799514 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a" exitCode=0 Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.799594 4922 generic.go:334] "Generic 
(PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c" exitCode=0 Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.799599 4922 scope.go:117] "RemoveContainer" containerID="4e43b874193e0cccc7b1ff6e7d41a35d7777ecc53ad7d7178923e009ca76dbcd" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.799606 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8" exitCode=0 Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.799704 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6" exitCode=2 Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.802300 4922 generic.go:334] "Generic (PLEG): container finished" podID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" containerID="0f5aa83e132da84ebb77b9c8d8371acd88a3509bc96c992baa1b131822fe3971" exitCode=0 Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.802352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad37f26c-293d-42c8-a88e-21e0a2c5e05d","Type":"ContainerDied","Data":"0f5aa83e132da84ebb77b9c8d8371acd88a3509bc96c992baa1b131822fe3971"} Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.803124 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:33 crc kubenswrapper[4922]: I0218 11:40:33.803619 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:34 crc kubenswrapper[4922]: I0218 11:40:34.814693 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.285395 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.286394 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.286720 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.292479 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.293413 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.293867 4922 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.294287 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.294793 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.337671 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kube-api-access\") pod \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.337743 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 
11:40:35.337819 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.337877 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.337926 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.337973 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kubelet-dir\") pod \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.337970 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338045 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338135 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ad37f26c-293d-42c8-a88e-21e0a2c5e05d" (UID: "ad37f26c-293d-42c8-a88e-21e0a2c5e05d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338201 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-var-lock\") pod \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\" (UID: \"ad37f26c-293d-42c8-a88e-21e0a2c5e05d\") " Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338309 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-var-lock" (OuterVolumeSpecName: "var-lock") pod "ad37f26c-293d-42c8-a88e-21e0a2c5e05d" (UID: "ad37f26c-293d-42c8-a88e-21e0a2c5e05d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338550 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338575 4922 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-var-lock\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338593 4922 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338611 4922 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.338628 4922 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.347642 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ad37f26c-293d-42c8-a88e-21e0a2c5e05d" (UID: "ad37f26c-293d-42c8-a88e-21e0a2c5e05d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.440081 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad37f26c-293d-42c8-a88e-21e0a2c5e05d-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.824109 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad37f26c-293d-42c8-a88e-21e0a2c5e05d","Type":"ContainerDied","Data":"2993e8b81cf0d5924b6da2a78a590d591a1692ddbff65fb9a4fa27002842c2e5"} Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.824188 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2993e8b81cf0d5924b6da2a78a590d591a1692ddbff65fb9a4fa27002842c2e5" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.824195 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.833661 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.837001 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.837106 4922 scope.go:117] "RemoveContainer" containerID="9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.836845 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe" exitCode=0 Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.859146 4922 scope.go:117] "RemoveContainer" containerID="90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.860077 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.860493 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.861175 4922 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.867274 4922 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.867927 4922 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.868255 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.875821 4922 scope.go:117] "RemoveContainer" containerID="bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.890517 4922 scope.go:117] "RemoveContainer" containerID="434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.904938 4922 scope.go:117] "RemoveContainer" containerID="dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.923319 4922 scope.go:117] "RemoveContainer" containerID="aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.945007 4922 scope.go:117] "RemoveContainer" containerID="9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a" Feb 18 11:40:35 crc 
kubenswrapper[4922]: E0218 11:40:35.946026 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\": container with ID starting with 9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a not found: ID does not exist" containerID="9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.946090 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a"} err="failed to get container status \"9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\": rpc error: code = NotFound desc = could not find container \"9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a\": container with ID starting with 9af62821d90e57595091aa9e7f1083acd21cb955ada0837c5ab4f71885ff7f1a not found: ID does not exist" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.946122 4922 scope.go:117] "RemoveContainer" containerID="90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c" Feb 18 11:40:35 crc kubenswrapper[4922]: E0218 11:40:35.946554 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\": container with ID starting with 90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c not found: ID does not exist" containerID="90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.946691 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c"} err="failed to get container status 
\"90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\": rpc error: code = NotFound desc = could not find container \"90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c\": container with ID starting with 90b891a2fe938b437cbfb28c7715d0ad758541d844a4768434d3c1f1a3f0578c not found: ID does not exist" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.946785 4922 scope.go:117] "RemoveContainer" containerID="bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8" Feb 18 11:40:35 crc kubenswrapper[4922]: E0218 11:40:35.947418 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\": container with ID starting with bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8 not found: ID does not exist" containerID="bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.947474 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8"} err="failed to get container status \"bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\": rpc error: code = NotFound desc = could not find container \"bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8\": container with ID starting with bab832676116da0ae3d9dc70d182e283d7ae2418db02c1ea0062812cab0d71b8 not found: ID does not exist" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.947495 4922 scope.go:117] "RemoveContainer" containerID="434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6" Feb 18 11:40:35 crc kubenswrapper[4922]: E0218 11:40:35.948888 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\": container with ID starting with 434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6 not found: ID does not exist" containerID="434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.949043 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6"} err="failed to get container status \"434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\": rpc error: code = NotFound desc = could not find container \"434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6\": container with ID starting with 434010c9772cf1c9db5578ecd4726ae2de0c0a3923d45b522868677cf78fcaf6 not found: ID does not exist" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.949163 4922 scope.go:117] "RemoveContainer" containerID="dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe" Feb 18 11:40:35 crc kubenswrapper[4922]: E0218 11:40:35.949866 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\": container with ID starting with dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe not found: ID does not exist" containerID="dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.949894 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe"} err="failed to get container status \"dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\": rpc error: code = NotFound desc = could not find container \"dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe\": container with ID 
starting with dacf3659cbb51e3d8df303470837041f3110f38b42187d40cbb4e53a8750cafe not found: ID does not exist" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.949910 4922 scope.go:117] "RemoveContainer" containerID="aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e" Feb 18 11:40:35 crc kubenswrapper[4922]: E0218 11:40:35.950254 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\": container with ID starting with aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e not found: ID does not exist" containerID="aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e" Feb 18 11:40:35 crc kubenswrapper[4922]: I0218 11:40:35.950385 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e"} err="failed to get container status \"aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\": rpc error: code = NotFound desc = could not find container \"aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e\": container with ID starting with aef2d87a1f410aab2dfaf3663b6d48d4772bcebaafc1d7b87e49613f8216826e not found: ID does not exist" Feb 18 11:40:36 crc kubenswrapper[4922]: I0218 11:40:36.984641 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.432350 4922 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.433237 4922 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.433752 4922 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.434409 4922 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.434729 4922 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:38 crc kubenswrapper[4922]: I0218 11:40:38.434776 4922 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.435060 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="200ms" Feb 18 11:40:38 crc kubenswrapper[4922]: E0218 11:40:38.635824 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="400ms" Feb 18 11:40:38 crc kubenswrapper[4922]: I0218 11:40:38.976395 4922 
status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:38 crc kubenswrapper[4922]: I0218 11:40:38.976897 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:39 crc kubenswrapper[4922]: E0218 11:40:39.037755 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="800ms" Feb 18 11:40:39 crc kubenswrapper[4922]: E0218 11:40:39.839773 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="1.6s" Feb 18 11:40:41 crc kubenswrapper[4922]: E0218 11:40:41.402766 4922 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18955467ce10eb1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 11:40:33.25573814 +0000 UTC m=+234.983442240,LastTimestamp:2026-02-18 11:40:33.25573814 +0000 UTC m=+234.983442240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 11:40:41 crc kubenswrapper[4922]: E0218 11:40:41.440636 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="3.2s" Feb 18 11:40:44 crc kubenswrapper[4922]: E0218 11:40:44.642329 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="6.4s" Feb 18 11:40:44 crc kubenswrapper[4922]: I0218 11:40:44.750810 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" containerName="oauth-openshift" containerID="cri-o://6267ab86fea6d91bef9e1d2fb055261c2673103810304b496229122d5cfe0a9c" gracePeriod=15 Feb 18 11:40:44 crc kubenswrapper[4922]: I0218 11:40:44.904034 4922 generic.go:334] "Generic (PLEG): container finished" podID="9e36551d-13cd-4a75-a29b-658850b46cb8" 
containerID="6267ab86fea6d91bef9e1d2fb055261c2673103810304b496229122d5cfe0a9c" exitCode=0 Feb 18 11:40:44 crc kubenswrapper[4922]: I0218 11:40:44.904147 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" event={"ID":"9e36551d-13cd-4a75-a29b-658850b46cb8","Type":"ContainerDied","Data":"6267ab86fea6d91bef9e1d2fb055261c2673103810304b496229122d5cfe0a9c"} Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.364979 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.365689 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.366159 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.366573 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.491716 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-session\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.491891 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpknq\" (UniqueName: \"kubernetes.io/projected/9e36551d-13cd-4a75-a29b-658850b46cb8-kube-api-access-mpknq\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.491952 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-error\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.491988 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-dir\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492049 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-ocp-branding-template\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492099 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-policies\") pod 
\"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-cliconfig\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492181 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-idp-0-file-data\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492170 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492240 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-login\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492304 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-provider-selection\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492389 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-service-ca\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492443 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-serving-cert\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492492 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-router-certs\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: 
\"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492528 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-trusted-ca-bundle\") pod \"9e36551d-13cd-4a75-a29b-658850b46cb8\" (UID: \"9e36551d-13cd-4a75-a29b-658850b46cb8\") " Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.492879 4922 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.494114 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.494138 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.494295 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.495926 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.498672 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.499355 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.499974 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.500322 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.506144 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.506383 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.506599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.507666 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e36551d-13cd-4a75-a29b-658850b46cb8-kube-api-access-mpknq" (OuterVolumeSpecName: "kube-api-access-mpknq") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "kube-api-access-mpknq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.510586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9e36551d-13cd-4a75-a29b-658850b46cb8" (UID: "9e36551d-13cd-4a75-a29b-658850b46cb8"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594560 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594619 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594641 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594661 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594684 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594705 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594725 4922 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594747 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpknq\" (UniqueName: \"kubernetes.io/projected/9e36551d-13cd-4a75-a29b-658850b46cb8-kube-api-access-mpknq\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594765 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594782 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594803 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594820 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.594841 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e36551d-13cd-4a75-a29b-658850b46cb8-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.911580 
4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" event={"ID":"9e36551d-13cd-4a75-a29b-658850b46cb8","Type":"ContainerDied","Data":"f32dc9e435ded17fa5224c75138342bd559fb562ee11ecfd7567b73083fd4cff"} Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.911635 4922 scope.go:117] "RemoveContainer" containerID="6267ab86fea6d91bef9e1d2fb055261c2673103810304b496229122d5cfe0a9c" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.911715 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.912839 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.913208 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.913469 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.936235 4922 status_manager.go:851] "Failed to get status for pod" 
podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.936819 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:45 crc kubenswrapper[4922]: I0218 11:40:45.937199 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.922664 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.922915 4922 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3" exitCode=1 Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.922962 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3"} Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.923752 4922 
scope.go:117] "RemoveContainer" containerID="8dda1daddcf256e52a7dc75c6b5c221fcd8b8fad50c50b6053774f9a8fab57e3" Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.924789 4922 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.925219 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.925666 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:46 crc kubenswrapper[4922]: I0218 11:40:46.926191 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.934015 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.934088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"66a332edd3c4a9a02ce364e963c0758fd557d7e92bd10259e23ad02259ab09e2"} Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.935324 4922 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.935949 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.936529 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.937061 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.973017 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.974068 4922 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.974679 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.975039 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.975412 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection 
refused" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.994819 4922 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.994854 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:47 crc kubenswrapper[4922]: E0218 11:40:47.995408 4922 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:47 crc kubenswrapper[4922]: I0218 11:40:47.996095 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.946856 4922 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="67e441d2187cdcc2ea099c84fa65e2f200b56661adf368584566eb8d7312fee4" exitCode=0 Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.946907 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"67e441d2187cdcc2ea099c84fa65e2f200b56661adf368584566eb8d7312fee4"} Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.946933 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c8d25a24af43d093ea8adebb00d2a85e2a250ba2dc1fc9fff1030a5a5a733bd"} Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.947187 4922 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.947201 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:48 crc kubenswrapper[4922]: E0218 11:40:48.947779 4922 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.948075 4922 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.948655 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.949315 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.949936 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" 
pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.984504 4922 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.984982 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.985312 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.985665 4922 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:48 crc kubenswrapper[4922]: I0218 11:40:48.985886 4922 status_manager.go:851] "Failed to get status for pod" 
podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.014141 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.018681 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.019233 4922 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.019740 4922 status_manager.go:851] "Failed to get status for pod" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.020393 4922 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.020845 4922 
status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.021245 4922 status_manager.go:851] "Failed to get status for pod" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" pod="openshift-authentication/oauth-openshift-558db77b4-q7mwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-q7mwg\": dial tcp 38.102.83.113:6443: connect: connection refused" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.956612 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"55bd723f32496e423143c9f5042299fcbf7e094a0e7a52680bc120b042d2616b"} Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.956940 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.956952 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cbdc5ff7e662f23c66224b081dc8fc9186de34341b715a86d99a10cbdccfdc3f"} Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.956964 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ca941c710b6879755b97d9936d9c1cf4defb71efa4628851713585ac256bf739"} Feb 18 11:40:49 crc kubenswrapper[4922]: I0218 11:40:49.956982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2e770938f62620937d9c3a7f4f84873a0e913f10ed1948e9f814cafccc25c5a0"} Feb 18 11:40:50 crc kubenswrapper[4922]: I0218 11:40:50.970352 4922 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:50 crc kubenswrapper[4922]: I0218 11:40:50.970905 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:50 crc kubenswrapper[4922]: I0218 11:40:50.970976 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dba04fd7b3341e318f1e884b3228eb4837fbd5b0209793720cba0ab6a1c029cd"} Feb 18 11:40:50 crc kubenswrapper[4922]: I0218 11:40:50.971105 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:52 crc kubenswrapper[4922]: I0218 11:40:52.996670 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:52 crc kubenswrapper[4922]: I0218 11:40:52.997088 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:53 crc kubenswrapper[4922]: I0218 11:40:53.004692 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:55 crc kubenswrapper[4922]: I0218 11:40:55.981342 4922 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:56 crc kubenswrapper[4922]: I0218 11:40:56.009956 4922 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:56 crc kubenswrapper[4922]: I0218 11:40:56.009991 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:56 crc kubenswrapper[4922]: I0218 11:40:56.017357 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:40:56 crc kubenswrapper[4922]: I0218 11:40:56.063139 4922 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1156a0a-0a03-46cd-88d7-4b38085976bd" Feb 18 11:40:57 crc kubenswrapper[4922]: I0218 11:40:57.017568 4922 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:57 crc kubenswrapper[4922]: I0218 11:40:57.017881 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:40:57 crc kubenswrapper[4922]: I0218 11:40:57.021916 4922 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1156a0a-0a03-46cd-88d7-4b38085976bd" Feb 18 11:41:02 crc kubenswrapper[4922]: I0218 11:41:02.361911 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:41:05 crc kubenswrapper[4922]: I0218 11:41:05.246868 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 18 11:41:05 crc kubenswrapper[4922]: I0218 11:41:05.731916 4922 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 11:41:06 crc kubenswrapper[4922]: I0218 11:41:06.455083 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 11:41:06 crc kubenswrapper[4922]: I0218 11:41:06.512857 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 11:41:06 crc kubenswrapper[4922]: I0218 11:41:06.914005 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.084205 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.639907 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.811447 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.816996 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.915942 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.993223 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 11:41:07 crc kubenswrapper[4922]: I0218 11:41:07.997652 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 
11:41:08.506395 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.558242 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.566193 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.584939 4922 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.677205 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.743099 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.810601 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.838483 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.896680 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 11:41:08 crc kubenswrapper[4922]: I0218 11:41:08.958422 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.038910 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 11:41:09 crc 
kubenswrapper[4922]: I0218 11:41:09.134969 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.149858 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.278782 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.293909 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.347158 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.379323 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.425521 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.435041 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.633276 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.798007 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.815934 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.836749 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.839658 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.854002 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.906495 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 11:41:09 crc kubenswrapper[4922]: I0218 11:41:09.991300 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.066507 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.117163 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.151677 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.176397 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.177018 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.362505 4922 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.388851 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.401540 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.440319 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.444844 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.558546 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.574621 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.637863 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.689546 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.702655 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.723106 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 11:41:10 crc 
kubenswrapper[4922]: I0218 11:41:10.821926 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.832847 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.862450 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.880760 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 18 11:41:10 crc kubenswrapper[4922]: I0218 11:41:10.887638 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.153148 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.252747 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.292396 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.328204 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.348438 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.363976 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 18 11:41:11 crc 
kubenswrapper[4922]: I0218 11:41:11.364095 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.365838 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.389993 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.420874 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.451483 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.492830 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.510310 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.659900 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.725761 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.806438 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.823225 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 11:41:11 
crc kubenswrapper[4922]: I0218 11:41:11.823467 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 11:41:11 crc kubenswrapper[4922]: I0218 11:41:11.921257 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.090128 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.108645 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.241772 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.308589 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.317168 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.351675 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.454033 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.485518 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.486942 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.491717 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.598995 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.636117 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.673773 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.832295 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.918536 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.935878 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 11:41:12 crc kubenswrapper[4922]: I0218 11:41:12.992622 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.008931 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.053803 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.071738 4922 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.119711 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.169793 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.182060 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.234010 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.291337 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.303691 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.303722 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.334031 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.374592 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.405186 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.409123 4922 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.465931 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.567174 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.614258 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.629198 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.661465 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.668845 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.699761 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.830444 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.842080 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 11:41:13 crc kubenswrapper[4922]: I0218 11:41:13.862609 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.044700 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.187309 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.245568 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.257326 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.273822 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.359595 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.511096 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.536607 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.587220 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.658062 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 
11:41:14.686174 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.726552 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.756835 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.772251 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.775857 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.788840 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.912562 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.988102 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 11:41:14 crc kubenswrapper[4922]: I0218 11:41:14.991028 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.056744 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.068969 4922 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.084655 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.117403 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.153387 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.228776 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.237388 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.238286 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.482209 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.483090 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.521983 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.588305 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.608229 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.688754 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.735005 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.789235 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.914508 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 11:41:15 crc kubenswrapper[4922]: I0218 11:41:15.920938 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.013914 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.280447 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.382742 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.382784 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 
11:41:16.446221 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.487582 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.526750 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.570309 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.604657 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.650643 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.672864 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.700970 4922 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.878526 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 18 11:41:16 crc kubenswrapper[4922]: I0218 11:41:16.945435 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.023219 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.027932 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.110418 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.127661 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.137421 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.185748 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.377213 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.407153 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.521631 4922 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.522402 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.523349 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podStartSLOduration=45.523332933 podStartE2EDuration="45.523332933s" podCreationTimestamp="2026-02-18 11:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:40:56.059827889 +0000 UTC m=+257.787531959" watchObservedRunningTime="2026-02-18 11:41:17.523332933 +0000 UTC m=+279.251037023" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.524271 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.526950 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q7mwg","openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527004 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-556766588f-78hf2"] Feb 18 11:41:17 crc kubenswrapper[4922]: E0218 11:41:17.527200 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" containerName="oauth-openshift" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527218 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" containerName="oauth-openshift" Feb 18 11:41:17 crc kubenswrapper[4922]: E0218 11:41:17.527239 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" containerName="installer" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527248 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" containerName="installer" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527391 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" 
containerName="oauth-openshift" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527404 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad37f26c-293d-42c8-a88e-21e0a2c5e05d" containerName="installer" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527625 4922 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527658 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="384d508e-8f6a-427d-a85d-85bb61d8405e" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.527845 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.531029 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.531285 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.531322 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.531461 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.531681 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.532559 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:41:17 crc 
kubenswrapper[4922]: I0218 11:41:17.532620 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.533715 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.535043 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.535247 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.535445 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.535583 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.538045 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.547678 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.551983 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.558686 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.572511 4922 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.580182 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.587350 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.587335436 podStartE2EDuration="22.587335436s" podCreationTimestamp="2026-02-18 11:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:41:17.586992307 +0000 UTC m=+279.314696407" watchObservedRunningTime="2026-02-18 11:41:17.587335436 +0000 UTC m=+279.315039516" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.592315 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.594059 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.612466 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.658333 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-serving-cert\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.658701 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.658834 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.658941 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-login\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659046 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659162 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-audit-policies\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659280 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659445 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-service-ca\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qms6p\" (UniqueName: \"kubernetes.io/projected/398e726b-2b70-4438-ac1b-bda8ca321928-kube-api-access-qms6p\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-router-certs\") pod \"oauth-openshift-556766588f-78hf2\" (UID: 
\"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659808 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/398e726b-2b70-4438-ac1b-bda8ca321928-audit-dir\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.659925 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-error\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.660024 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-session\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.660381 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-cliconfig\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.761863 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-cliconfig\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762140 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-serving-cert\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762234 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762332 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762430 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-login\") pod 
\"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762591 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-audit-policies\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762764 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-service-ca\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762846 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qms6p\" (UniqueName: \"kubernetes.io/projected/398e726b-2b70-4438-ac1b-bda8ca321928-kube-api-access-qms6p\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.762924 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-router-certs\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.763001 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/398e726b-2b70-4438-ac1b-bda8ca321928-audit-dir\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.763094 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-error\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.763177 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-session\") pod \"oauth-openshift-556766588f-78hf2\" (UID: 
\"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.763613 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/398e726b-2b70-4438-ac1b-bda8ca321928-audit-dir\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.763001 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-cliconfig\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.764586 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-audit-policies\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.764599 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-service-ca\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.764868 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.769079 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-serving-cert\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.769603 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-session\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.770689 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.771885 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-error\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 
11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.772023 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.773347 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-user-template-login\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.775782 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-router-certs\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.785743 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/398e726b-2b70-4438-ac1b-bda8ca321928-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.793107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qms6p\" (UniqueName: 
\"kubernetes.io/projected/398e726b-2b70-4438-ac1b-bda8ca321928-kube-api-access-qms6p\") pod \"oauth-openshift-556766588f-78hf2\" (UID: \"398e726b-2b70-4438-ac1b-bda8ca321928\") " pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.801072 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.801250 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 11:41:17 crc kubenswrapper[4922]: I0218 11:41:17.846440 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.003647 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.059911 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-556766588f-78hf2"] Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.159557 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.205971 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.221141 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.251893 4922 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 
11:41:18.260619 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.264764 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.278446 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.297440 4922 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.297690 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125" gracePeriod=5 Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.305572 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.340337 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.431512 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.475712 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-556766588f-78hf2"] Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.601817 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 18 11:41:18 crc 
kubenswrapper[4922]: I0218 11:41:18.656612 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.715271 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.747570 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 11:41:18 crc kubenswrapper[4922]: I0218 11:41:18.981581 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e36551d-13cd-4a75-a29b-658850b46cb8" path="/var/lib/kubelet/pods/9e36551d-13cd-4a75-a29b-658850b46cb8/volumes" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.000316 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.012964 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.115762 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.141614 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.150791 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" event={"ID":"398e726b-2b70-4438-ac1b-bda8ca321928","Type":"ContainerStarted","Data":"2a35855c08d4543fbb694477f09db21d8232028117686782bfed4d4d27c22e2e"} Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.150851 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" event={"ID":"398e726b-2b70-4438-ac1b-bda8ca321928","Type":"ContainerStarted","Data":"a07aa410aa6179b667b4c581f1683feb197787cff1c9d03d894b65033c360287"} Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.154862 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.158094 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.172076 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-556766588f-78hf2" podStartSLOduration=60.172058754 podStartE2EDuration="1m0.172058754s" podCreationTimestamp="2026-02-18 11:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:41:19.171483138 +0000 UTC m=+280.899187228" watchObservedRunningTime="2026-02-18 11:41:19.172058754 +0000 UTC m=+280.899762834" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.438401 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.449888 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.485320 4922 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.565936 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 11:41:19 crc kubenswrapper[4922]: 
I0218 11:41:19.857739 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.908546 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 18 11:41:19 crc kubenswrapper[4922]: I0218 11:41:19.990213 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.007557 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.065549 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.067339 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.168569 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.233219 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.238971 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.276888 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.392300 4922 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.429855 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.475990 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.609776 4922 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.811349 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 18 11:41:20 crc kubenswrapper[4922]: I0218 11:41:20.976559 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 18 11:41:23 crc kubenswrapper[4922]: I0218 11:41:23.929571 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 18 11:41:23 crc kubenswrapper[4922]: I0218 11:41:23.930664 4922 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039281 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039337 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039387 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039430 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039465 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039803 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName:
"manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.039878 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.040099 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.040213 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.048550 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.141058 4922 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.141093 4922 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.141108 4922 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.141128 4922 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.141142 4922 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.192490 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.192565 4922 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125" exitCode=137
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.192619 4922 scope.go:117] "RemoveContainer"
containerID="856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.192679 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.222453 4922 scope.go:117] "RemoveContainer" containerID="856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125"
Feb 18 11:41:24 crc kubenswrapper[4922]: E0218 11:41:24.223068 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125\": container with ID starting with 856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125 not found: ID does not exist" containerID="856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.223382 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125"} err="failed to get container status \"856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125\": rpc error: code = NotFound desc = could not find container \"856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125\": container with ID starting with 856502c67849034b25212f7f840344a5e8241e9f0ff22714b0ccd364807de125 not found: ID does not exist"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.983162 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 18 11:41:24 crc kubenswrapper[4922]: I0218 11:41:24.984016 4922 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 18 11:41:25 crc kubenswrapper[4922]:
I0218 11:41:25.033628 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 18 11:41:25 crc kubenswrapper[4922]: I0218 11:41:25.033691 4922 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="032f3636-eb0f-4013-8788-c79899e94cc8"
Feb 18 11:41:25 crc kubenswrapper[4922]: I0218 11:41:25.039212 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 18 11:41:25 crc kubenswrapper[4922]: I0218 11:41:25.039264 4922 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="032f3636-eb0f-4013-8788-c79899e94cc8"
Feb 18 11:41:37 crc kubenswrapper[4922]: I0218 11:41:37.632943 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 18 11:41:37 crc kubenswrapper[4922]: I0218 11:41:37.782521 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 18 11:41:38 crc kubenswrapper[4922]: I0218 11:41:38.745026 4922 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 18 11:41:40 crc kubenswrapper[4922]: I0218 11:41:40.088157 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:41:46 crc kubenswrapper[4922]: I0218 11:41:46.741830 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 18 11:41:47 crc kubenswrapper[4922]: I0218 11:41:47.588988 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 18 11:41:47 crc kubenswrapper[4922]: I0218 11:41:47.597599 4922
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 18 11:41:47 crc kubenswrapper[4922]: I0218 11:41:47.833641 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 18 11:41:59 crc kubenswrapper[4922]: I0218 11:41:59.937657 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 18 11:42:00 crc kubenswrapper[4922]: I0218 11:42:00.446410 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.402721 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vs74j"]
Feb 18 11:42:02 crc kubenswrapper[4922]: E0218 11:42:02.402977 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.402992 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.403127 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.403801 4922 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.417004 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vs74j"]
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.481961 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/907f9baa-e193-4055-b982-d9a58830ea01-trusted-ca\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482306 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/907f9baa-e193-4055-b982-d9a58830ea01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-registry-tls\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482397 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") "
pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482429 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8khc\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-kube-api-access-p8khc\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482456 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/907f9baa-e193-4055-b982-d9a58830ea01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-bound-sa-token\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.482525 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/907f9baa-e193-4055-b982-d9a58830ea01-registry-certificates\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.502682 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.583020 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/907f9baa-e193-4055-b982-d9a58830ea01-trusted-ca\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.583078 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/907f9baa-e193-4055-b982-d9a58830ea01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.583096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-registry-tls\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.583126 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8khc\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-kube-api-access-p8khc\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]:
I0218 11:42:02.583147 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/907f9baa-e193-4055-b982-d9a58830ea01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.583171 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-bound-sa-token\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.583346 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/907f9baa-e193-4055-b982-d9a58830ea01-registry-certificates\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.584282 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/907f9baa-e193-4055-b982-d9a58830ea01-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.584710 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/907f9baa-e193-4055-b982-d9a58830ea01-trusted-ca\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") "
pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.584770 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/907f9baa-e193-4055-b982-d9a58830ea01-registry-certificates\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.590457 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-registry-tls\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.602895 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-bound-sa-token\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.604075 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/907f9baa-e193-4055-b982-d9a58830ea01-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.605160 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8khc\" (UniqueName: \"kubernetes.io/projected/907f9baa-e193-4055-b982-d9a58830ea01-kube-api-access-p8khc\") pod
\"image-registry-66df7c8f76-vs74j\" (UID: \"907f9baa-e193-4055-b982-d9a58830ea01\") " pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:02 crc kubenswrapper[4922]: I0218 11:42:02.766401 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:03 crc kubenswrapper[4922]: I0218 11:42:03.390524 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vs74j"]
Feb 18 11:42:03 crc kubenswrapper[4922]: I0218 11:42:03.479910 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j" event={"ID":"907f9baa-e193-4055-b982-d9a58830ea01","Type":"ContainerStarted","Data":"42ac242033fda98a291ccd1a343dcd807760f0ab7af506661d3fd52a6c320d25"}
Feb 18 11:42:04 crc kubenswrapper[4922]: I0218 11:42:04.487832 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j" event={"ID":"907f9baa-e193-4055-b982-d9a58830ea01","Type":"ContainerStarted","Data":"c7f1fff82df0683e8840e22109dfa6a24b674de8548daeedacd7f959f941418e"}
Feb 18 11:42:04 crc kubenswrapper[4922]: I0218 11:42:04.488009 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j"
Feb 18 11:42:04 crc kubenswrapper[4922]: I0218 11:42:04.516567 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j" podStartSLOduration=2.516547376 podStartE2EDuration="2.516547376s" podCreationTimestamp="2026-02-18 11:42:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:42:04.513045457 +0000 UTC m=+326.240749537" watchObservedRunningTime="2026-02-18 11:42:04.516547376 +0000 UTC m=+326.244251456"
Feb 18 11:42:10 crc
kubenswrapper[4922]: I0218 11:42:10.782381 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dzbt"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.783185 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7dzbt" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="registry-server" containerID="cri-o://8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77" gracePeriod=30
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.795518 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5lflw"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.795787 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5lflw" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="registry-server" containerID="cri-o://3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b" gracePeriod=30
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.803219 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nc7b9"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.803611 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerName="marketplace-operator" containerID="cri-o://47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0" gracePeriod=30
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.825560 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vjsn"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.826085 4922 kuberuntime_container.go:808] "Killing container with a grace period"
pod="openshift-marketplace/redhat-marketplace-5vjsn" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="registry-server" containerID="cri-o://7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178" gracePeriod=30
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.830240 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wz74v"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.830421 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wz74v" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="registry-server" containerID="cri-o://f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927" gracePeriod=30
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.834254 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gjc8w"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.835065 4922 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.844272 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gjc8w"]
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.920801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452cdbd0-d1e1-491a-8edd-d0f88f602364-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.920841 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq9rm\" (UniqueName: \"kubernetes.io/projected/452cdbd0-d1e1-491a-8edd-d0f88f602364-kube-api-access-tq9rm\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:10 crc kubenswrapper[4922]: I0218 11:42:10.920899 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/452cdbd0-d1e1-491a-8edd-d0f88f602364-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.022482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/452cdbd0-d1e1-491a-8edd-d0f88f602364-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gjc8w\" (UID:
\"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.022609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452cdbd0-d1e1-491a-8edd-d0f88f602364-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.022657 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq9rm\" (UniqueName: \"kubernetes.io/projected/452cdbd0-d1e1-491a-8edd-d0f88f602364-kube-api-access-tq9rm\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.024420 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452cdbd0-d1e1-491a-8edd-d0f88f602364-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.029409 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/452cdbd0-d1e1-491a-8edd-d0f88f602364-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.040544 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq9rm\"
(UniqueName: \"kubernetes.io/projected/452cdbd0-d1e1-491a-8edd-d0f88f602364-kube-api-access-tq9rm\") pod \"marketplace-operator-79b997595-gjc8w\" (UID: \"452cdbd0-d1e1-491a-8edd-d0f88f602364\") " pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.290185 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.293938 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.305079 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.328902 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6nbf\" (UniqueName: \"kubernetes.io/projected/fe4edbcb-8a38-4f30-975f-aa4825192b4e-kube-api-access-k6nbf\") pod \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.328988 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-utilities\") pod \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.329161 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-catalog-content\") pod \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\" (UID: \"fe4edbcb-8a38-4f30-975f-aa4825192b4e\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 
11:42:11.330683 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-utilities" (OuterVolumeSpecName: "utilities") pod "fe4edbcb-8a38-4f30-975f-aa4825192b4e" (UID: "fe4edbcb-8a38-4f30-975f-aa4825192b4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.341698 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.352148 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4edbcb-8a38-4f30-975f-aa4825192b4e-kube-api-access-k6nbf" (OuterVolumeSpecName: "kube-api-access-k6nbf") pod "fe4edbcb-8a38-4f30-975f-aa4825192b4e" (UID: "fe4edbcb-8a38-4f30-975f-aa4825192b4e"). InnerVolumeSpecName "kube-api-access-k6nbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.360041 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.362740 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wz74v" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.410220 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe4edbcb-8a38-4f30-975f-aa4825192b4e" (UID: "fe4edbcb-8a38-4f30-975f-aa4825192b4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431184 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-catalog-content\") pod \"47c627d0-6fb9-4b77-b266-74670361fcd6\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431249 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-utilities\") pod \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431287 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-catalog-content\") pod \"bf0d2342-e758-43cc-8c89-adc3ceb98453\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431320 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-operator-metrics\") pod \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431386 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5ql\" (UniqueName: \"kubernetes.io/projected/aa233e7a-8a71-495c-b696-2f3dac9f0ada-kube-api-access-2h5ql\") pod \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\" (UID: \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431507 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-x7zw6\" (UniqueName: \"kubernetes.io/projected/bf0d2342-e758-43cc-8c89-adc3ceb98453-kube-api-access-x7zw6\") pod \"bf0d2342-e758-43cc-8c89-adc3ceb98453\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431537 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-utilities\") pod \"bf0d2342-e758-43cc-8c89-adc3ceb98453\" (UID: \"bf0d2342-e758-43cc-8c89-adc3ceb98453\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431569 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-catalog-content\") pod \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431597 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x5ts\" (UniqueName: \"kubernetes.io/projected/9cddee0a-8b13-429b-89b6-e820f8f3ec59-kube-api-access-6x5ts\") pod \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\" (UID: \"9cddee0a-8b13-429b-89b6-e820f8f3ec59\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z6qm\" (UniqueName: \"kubernetes.io/projected/47c627d0-6fb9-4b77-b266-74670361fcd6-kube-api-access-4z6qm\") pod \"47c627d0-6fb9-4b77-b266-74670361fcd6\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-trusted-ca\") pod \"aa233e7a-8a71-495c-b696-2f3dac9f0ada\" (UID: 
\"aa233e7a-8a71-495c-b696-2f3dac9f0ada\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.431818 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-utilities\") pod \"47c627d0-6fb9-4b77-b266-74670361fcd6\" (UID: \"47c627d0-6fb9-4b77-b266-74670361fcd6\") " Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.432149 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6nbf\" (UniqueName: \"kubernetes.io/projected/fe4edbcb-8a38-4f30-975f-aa4825192b4e-kube-api-access-k6nbf\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.432171 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.432184 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4edbcb-8a38-4f30-975f-aa4825192b4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.433071 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-utilities" (OuterVolumeSpecName: "utilities") pod "47c627d0-6fb9-4b77-b266-74670361fcd6" (UID: "47c627d0-6fb9-4b77-b266-74670361fcd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.433947 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-utilities" (OuterVolumeSpecName: "utilities") pod "bf0d2342-e758-43cc-8c89-adc3ceb98453" (UID: "bf0d2342-e758-43cc-8c89-adc3ceb98453"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.434774 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0d2342-e758-43cc-8c89-adc3ceb98453-kube-api-access-x7zw6" (OuterVolumeSpecName: "kube-api-access-x7zw6") pod "bf0d2342-e758-43cc-8c89-adc3ceb98453" (UID: "bf0d2342-e758-43cc-8c89-adc3ceb98453"). InnerVolumeSpecName "kube-api-access-x7zw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.437174 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "aa233e7a-8a71-495c-b696-2f3dac9f0ada" (UID: "aa233e7a-8a71-495c-b696-2f3dac9f0ada"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.441339 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "aa233e7a-8a71-495c-b696-2f3dac9f0ada" (UID: "aa233e7a-8a71-495c-b696-2f3dac9f0ada"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.441649 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c627d0-6fb9-4b77-b266-74670361fcd6-kube-api-access-4z6qm" (OuterVolumeSpecName: "kube-api-access-4z6qm") pod "47c627d0-6fb9-4b77-b266-74670361fcd6" (UID: "47c627d0-6fb9-4b77-b266-74670361fcd6"). InnerVolumeSpecName "kube-api-access-4z6qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.442185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-utilities" (OuterVolumeSpecName: "utilities") pod "9cddee0a-8b13-429b-89b6-e820f8f3ec59" (UID: "9cddee0a-8b13-429b-89b6-e820f8f3ec59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.444764 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa233e7a-8a71-495c-b696-2f3dac9f0ada-kube-api-access-2h5ql" (OuterVolumeSpecName: "kube-api-access-2h5ql") pod "aa233e7a-8a71-495c-b696-2f3dac9f0ada" (UID: "aa233e7a-8a71-495c-b696-2f3dac9f0ada"). InnerVolumeSpecName "kube-api-access-2h5ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.444918 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cddee0a-8b13-429b-89b6-e820f8f3ec59-kube-api-access-6x5ts" (OuterVolumeSpecName: "kube-api-access-6x5ts") pod "9cddee0a-8b13-429b-89b6-e820f8f3ec59" (UID: "9cddee0a-8b13-429b-89b6-e820f8f3ec59"). InnerVolumeSpecName "kube-api-access-6x5ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.480514 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf0d2342-e758-43cc-8c89-adc3ceb98453" (UID: "bf0d2342-e758-43cc-8c89-adc3ceb98453"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.500637 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cddee0a-8b13-429b-89b6-e820f8f3ec59" (UID: "9cddee0a-8b13-429b-89b6-e820f8f3ec59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.530124 4922 generic.go:334] "Generic (PLEG): container finished" podID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerID="f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927" exitCode=0 Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.530181 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wz74v" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.530179 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerDied","Data":"f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.530224 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wz74v" event={"ID":"47c627d0-6fb9-4b77-b266-74670361fcd6","Type":"ContainerDied","Data":"86936b3faf99a98562fccc6ce9e3e9f7de7879c692a3b15d363c67f9bb07864e"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.530242 4922 scope.go:117] "RemoveContainer" containerID="f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532704 4922 generic.go:334] "Generic (PLEG): container finished" podID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" 
containerID="8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77" exitCode=0 Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532759 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zw6\" (UniqueName: \"kubernetes.io/projected/bf0d2342-e758-43cc-8c89-adc3ceb98453-kube-api-access-x7zw6\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532773 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerDied","Data":"8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532800 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dzbt" event={"ID":"fe4edbcb-8a38-4f30-975f-aa4825192b4e","Type":"ContainerDied","Data":"aa0626d406720474e06eba27d9c88b12751f048f72073c63b3e1e91b6784d080"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532782 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532838 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532853 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x5ts\" (UniqueName: \"kubernetes.io/projected/9cddee0a-8b13-429b-89b6-e820f8f3ec59-kube-api-access-6x5ts\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532863 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7dzbt" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532870 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z6qm\" (UniqueName: \"kubernetes.io/projected/47c627d0-6fb9-4b77-b266-74670361fcd6-kube-api-access-4z6qm\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532884 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532897 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532910 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cddee0a-8b13-429b-89b6-e820f8f3ec59-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532923 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf0d2342-e758-43cc-8c89-adc3ceb98453-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532936 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aa233e7a-8a71-495c-b696-2f3dac9f0ada-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.532950 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5ql\" (UniqueName: \"kubernetes.io/projected/aa233e7a-8a71-495c-b696-2f3dac9f0ada-kube-api-access-2h5ql\") on node \"crc\" DevicePath 
\"\"" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.537488 4922 generic.go:334] "Generic (PLEG): container finished" podID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerID="7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178" exitCode=0 Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.537545 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerDied","Data":"7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.537571 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vjsn" event={"ID":"bf0d2342-e758-43cc-8c89-adc3ceb98453","Type":"ContainerDied","Data":"253c1fe74cd3ee618dc4b79bd02124aafb18cc9d88de6205e0a30abc2ecadb34"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.537635 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vjsn" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.548341 4922 generic.go:334] "Generic (PLEG): container finished" podID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerID="3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b" exitCode=0 Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.548485 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5lflw" event={"ID":"9cddee0a-8b13-429b-89b6-e820f8f3ec59","Type":"ContainerDied","Data":"3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.548530 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5lflw" event={"ID":"9cddee0a-8b13-429b-89b6-e820f8f3ec59","Type":"ContainerDied","Data":"c1ce59c10870c2ecd21ad32da1730316e1c9e1d338deac7b1c3b3f7688db298c"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.548579 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5lflw" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.551908 4922 generic.go:334] "Generic (PLEG): container finished" podID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerID="47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0" exitCode=0 Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.551940 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" event={"ID":"aa233e7a-8a71-495c-b696-2f3dac9f0ada","Type":"ContainerDied","Data":"47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.551964 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" event={"ID":"aa233e7a-8a71-495c-b696-2f3dac9f0ada","Type":"ContainerDied","Data":"0d0074ed8642e690505cb0bcb8a3858df0eaa0a88849d13791f2daa4f4d6c521"} Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.552038 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nc7b9" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.553535 4922 scope.go:117] "RemoveContainer" containerID="e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.586504 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vjsn"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.588955 4922 scope.go:117] "RemoveContainer" containerID="339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.596960 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vjsn"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.610731 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dzbt"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.614741 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7dzbt"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.621510 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5lflw"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.622571 4922 scope.go:117] "RemoveContainer" containerID="f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927" Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.623104 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927\": container with ID starting with f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927 not found: ID does not exist" containerID="f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927" Feb 18 11:42:11 crc kubenswrapper[4922]: 
I0218 11:42:11.623165 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927"} err="failed to get container status \"f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927\": rpc error: code = NotFound desc = could not find container \"f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927\": container with ID starting with f527847320af902ee6f743ec960949ebda5617bce3a441ada1527d1de64a2927 not found: ID does not exist" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.623188 4922 scope.go:117] "RemoveContainer" containerID="e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1" Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.623498 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1\": container with ID starting with e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1 not found: ID does not exist" containerID="e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.623520 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1"} err="failed to get container status \"e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1\": rpc error: code = NotFound desc = could not find container \"e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1\": container with ID starting with e1f53ef0abd3288d9c5b216794b6258c6d92e85e01c8016df80b39534a5185b1 not found: ID does not exist" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.623534 4922 scope.go:117] "RemoveContainer" containerID="339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679" Feb 18 11:42:11 crc 
kubenswrapper[4922]: E0218 11:42:11.623786 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679\": container with ID starting with 339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679 not found: ID does not exist" containerID="339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.623807 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679"} err="failed to get container status \"339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679\": rpc error: code = NotFound desc = could not find container \"339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679\": container with ID starting with 339bdbf56d4f0a3e6321916973a3099db06b916e6ec5ad9353826af30e315679 not found: ID does not exist" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.623818 4922 scope.go:117] "RemoveContainer" containerID="8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.625591 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5lflw"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.628832 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nc7b9"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.631887 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nc7b9"] Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.635853 4922 scope.go:117] "RemoveContainer" containerID="a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395" Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.642745 
4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47c627d0-6fb9-4b77-b266-74670361fcd6" (UID: "47c627d0-6fb9-4b77-b266-74670361fcd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.648835 4922 scope.go:117] "RemoveContainer" containerID="f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.664323 4922 scope.go:117] "RemoveContainer" containerID="8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.664772 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77\": container with ID starting with 8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77 not found: ID does not exist" containerID="8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.664818 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77"} err="failed to get container status \"8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77\": rpc error: code = NotFound desc = could not find container \"8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77\": container with ID starting with 8fdaa83d5f72f970ccf8db14422f820c61b3754fa8464fddf21f1bada5ca3b77 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.664845 4922 scope.go:117] "RemoveContainer" containerID="a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.665242 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395\": container with ID starting with a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395 not found: ID does not exist" containerID="a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.665266 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395"} err="failed to get container status \"a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395\": rpc error: code = NotFound desc = could not find container \"a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395\": container with ID starting with a2d26f75f0731112c78aac584a6d3dba85b5a815a715a0b94935517739394395 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.665295 4922 scope.go:117] "RemoveContainer" containerID="f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.665569 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c\": container with ID starting with f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c not found: ID does not exist" containerID="f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.665591 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c"} err="failed to get container status \"f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c\": rpc error: code = NotFound desc = could not find container \"f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c\": container with ID starting with f76c202bd50c16deefebdd21c812fb020f0a14b30d06f1245a0034d7909dea9c not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.665605 4922 scope.go:117] "RemoveContainer" containerID="7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.678393 4922 scope.go:117] "RemoveContainer" containerID="bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.694744 4922 scope.go:117] "RemoveContainer" containerID="88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.721054 4922 scope.go:117] "RemoveContainer" containerID="7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.721823 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178\": container with ID starting with 7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178 not found: ID does not exist" containerID="7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.721901 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178"} err="failed to get container status \"7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178\": rpc error: code = NotFound desc = could not find container \"7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178\": container with ID starting with 7540772acba64ca21dc3279082ac26db8226d215a021ff0397854fc0d4e90178 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.722012 4922 scope.go:117] "RemoveContainer" containerID="bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.722889 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd\": container with ID starting with bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd not found: ID does not exist" containerID="bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.722945 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd"} err="failed to get container status \"bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd\": rpc error: code = NotFound desc = could not find container \"bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd\": container with ID starting with bb328f8b6bae7db774e27f7422bc433c58fde944326f82372643b61339bf49bd not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.722981 4922 scope.go:117] "RemoveContainer" containerID="88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.723348 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50\": container with ID starting with 88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50 not found: ID does not exist" containerID="88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.723395 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50"} err="failed to get container status \"88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50\": rpc error: code = NotFound desc = could not find container \"88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50\": container with ID starting with 88ce7213e522a2bdfb89014af40e0edefc2f7d30268eda8f28f4c4825cf47c50 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.723411 4922 scope.go:117] "RemoveContainer" containerID="3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.736039 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c627d0-6fb9-4b77-b266-74670361fcd6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.742508 4922 scope.go:117] "RemoveContainer" containerID="0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.770542 4922 scope.go:117] "RemoveContainer" containerID="51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.779645 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gjc8w"]
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.792032 4922 scope.go:117] "RemoveContainer" containerID="3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.793050 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b\": container with ID starting with 3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b not found: ID does not exist" containerID="3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.793088 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b"} err="failed to get container status \"3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b\": rpc error: code = NotFound desc = could not find container \"3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b\": container with ID starting with 3095825ff9ad7d2de643243385e47b2af86046b60063ce1e20f19e93ab18808b not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.793115 4922 scope.go:117] "RemoveContainer" containerID="0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.793421 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181\": container with ID starting with 0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181 not found: ID does not exist" containerID="0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.793453 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181"} err="failed to get container status \"0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181\": rpc error: code = NotFound desc = could not find container \"0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181\": container with ID starting with 0cb6e0bde2199b61ebae0ee2265108d0d8e55760b5cb6ca0dac142b8c2a65181 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.793476 4922 scope.go:117] "RemoveContainer" containerID="51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.793797 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605\": container with ID starting with 51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605 not found: ID does not exist" containerID="51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.793828 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605"} err="failed to get container status \"51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605\": rpc error: code = NotFound desc = could not find container \"51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605\": container with ID starting with 51b067b284f625e9e52dba70686a38665f13c1a97c6bdfd4e23fe3752e04d605 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.793852 4922 scope.go:117] "RemoveContainer" containerID="47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.811878 4922 scope.go:117] "RemoveContainer" containerID="47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0"
Feb 18 11:42:11 crc kubenswrapper[4922]: E0218 11:42:11.812398 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0\": container with ID starting with 47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0 not found: ID does not exist" containerID="47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.812431 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0"} err="failed to get container status \"47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0\": rpc error: code = NotFound desc = could not find container \"47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0\": container with ID starting with 47f81b843aae8ab322577ee88976e3fa1b06eba4eb120e58e24ce0de633eadc0 not found: ID does not exist"
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.865895 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wz74v"]
Feb 18 11:42:11 crc kubenswrapper[4922]: I0218 11:42:11.869950 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wz74v"]
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.570179 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w" event={"ID":"452cdbd0-d1e1-491a-8edd-d0f88f602364","Type":"ContainerStarted","Data":"08d143d8ffe4dee8c593cdef88229d35623dba820d4fea7632533876ebe75223"}
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.570435 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.570450 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w" event={"ID":"452cdbd0-d1e1-491a-8edd-d0f88f602364","Type":"ContainerStarted","Data":"5c956ec9346cb8bb4e092814f3abd8855e5b58608bc7328e5a8361ab35542ede"}
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.574907 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.588087 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gjc8w" podStartSLOduration=2.588062174 podStartE2EDuration="2.588062174s" podCreationTimestamp="2026-02-18 11:42:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:42:12.584496537 +0000 UTC m=+334.312200627" watchObservedRunningTime="2026-02-18 11:42:12.588062174 +0000 UTC m=+334.315766274"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.982010 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" path="/var/lib/kubelet/pods/47c627d0-6fb9-4b77-b266-74670361fcd6/volumes"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.983765 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" path="/var/lib/kubelet/pods/9cddee0a-8b13-429b-89b6-e820f8f3ec59/volumes"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.984561 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" path="/var/lib/kubelet/pods/aa233e7a-8a71-495c-b696-2f3dac9f0ada/volumes"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.987218 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" path="/var/lib/kubelet/pods/bf0d2342-e758-43cc-8c89-adc3ceb98453/volumes"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.987892 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" path="/var/lib/kubelet/pods/fe4edbcb-8a38-4f30-975f-aa4825192b4e/volumes"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998054 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cfw5z"]
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998314 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998343 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998353 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998380 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998389 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998399 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998409 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998415 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998423 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998429 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998436 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998460 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998470 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998476 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998488 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998493 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998501 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998508 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="extract-content"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998536 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998543 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998552 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerName="marketplace-operator"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998557 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerName="marketplace-operator"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998566 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998572 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="extract-utilities"
Feb 18 11:42:12 crc kubenswrapper[4922]: E0218 11:42:12.998579 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998586 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998760 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c627d0-6fb9-4b77-b266-74670361fcd6" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998792 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4edbcb-8a38-4f30-975f-aa4825192b4e" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998805 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa233e7a-8a71-495c-b696-2f3dac9f0ada" containerName="marketplace-operator"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998812 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0d2342-e758-43cc-8c89-adc3ceb98453" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.998821 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cddee0a-8b13-429b-89b6-e820f8f3ec59" containerName="registry-server"
Feb 18 11:42:12 crc kubenswrapper[4922]: I0218 11:42:12.999753 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.001856 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.002226 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cfw5z"]
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.054951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523054ef-f8bb-4c7d-9baa-47191e299fcd-utilities\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.055043 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pzr\" (UniqueName: \"kubernetes.io/projected/523054ef-f8bb-4c7d-9baa-47191e299fcd-kube-api-access-d5pzr\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.055097 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523054ef-f8bb-4c7d-9baa-47191e299fcd-catalog-content\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.156245 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523054ef-f8bb-4c7d-9baa-47191e299fcd-utilities\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.156327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pzr\" (UniqueName: \"kubernetes.io/projected/523054ef-f8bb-4c7d-9baa-47191e299fcd-kube-api-access-d5pzr\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.156385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523054ef-f8bb-4c7d-9baa-47191e299fcd-catalog-content\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.156710 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523054ef-f8bb-4c7d-9baa-47191e299fcd-utilities\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.156783 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523054ef-f8bb-4c7d-9baa-47191e299fcd-catalog-content\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.176964 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pzr\" (UniqueName: \"kubernetes.io/projected/523054ef-f8bb-4c7d-9baa-47191e299fcd-kube-api-access-d5pzr\") pod \"certified-operators-cfw5z\" (UID: \"523054ef-f8bb-4c7d-9baa-47191e299fcd\") " pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.202866 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6bqhb"]
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.204418 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.208417 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.213379 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bqhb"]
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.257537 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q98tn\" (UniqueName: \"kubernetes.io/projected/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-kube-api-access-q98tn\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.257613 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-utilities\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.257787 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-catalog-content\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.326041 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfw5z"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.359971 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q98tn\" (UniqueName: \"kubernetes.io/projected/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-kube-api-access-q98tn\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.360115 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-utilities\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.360219 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-catalog-content\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.362136 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-catalog-content\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.362458 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-utilities\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.375227 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q98tn\" (UniqueName: \"kubernetes.io/projected/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-kube-api-access-q98tn\") pod \"redhat-marketplace-6bqhb\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.527583 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bqhb"
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.752743 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cfw5z"]
Feb 18 11:42:13 crc kubenswrapper[4922]: I0218 11:42:13.901731 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bqhb"]
Feb 18 11:42:14 crc kubenswrapper[4922]: I0218 11:42:14.581981 4922 generic.go:334] "Generic (PLEG): container finished" podID="523054ef-f8bb-4c7d-9baa-47191e299fcd" containerID="5c138e51dd53973f4bcf82a962a1cbb32de5df0803519276a5f62e791887d52f" exitCode=0
Feb 18 11:42:14 crc kubenswrapper[4922]: I0218 11:42:14.582033 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfw5z" event={"ID":"523054ef-f8bb-4c7d-9baa-47191e299fcd","Type":"ContainerDied","Data":"5c138e51dd53973f4bcf82a962a1cbb32de5df0803519276a5f62e791887d52f"}
Feb 18 11:42:14 crc kubenswrapper[4922]: I0218 11:42:14.582163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfw5z" event={"ID":"523054ef-f8bb-4c7d-9baa-47191e299fcd","Type":"ContainerStarted","Data":"42ee5053d4102f9b61bc44a59c440005e5ec25fd25263fc4c4467a03f68e1731"}
Feb 18 11:42:14 crc kubenswrapper[4922]: I0218 11:42:14.584348 4922 generic.go:334] "Generic (PLEG): container finished" podID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerID="66cf4f94781e4ece125829fc4a1a5acf7beefaa52399d35c3cf834cf5448be6c" exitCode=0
Feb 18 11:42:14 crc kubenswrapper[4922]: I0218 11:42:14.584419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerDied","Data":"66cf4f94781e4ece125829fc4a1a5acf7beefaa52399d35c3cf834cf5448be6c"}
Feb 18 11:42:14 crc kubenswrapper[4922]: I0218 11:42:14.584484 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerStarted","Data":"81c45efcdac9362802de58dd27bf893197de86f7ebd3df5f164f632b261368cc"}
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.401777 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h9zn6"]
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.410217 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.413593 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.419413 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h9zn6"]
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.489138 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f779c873-d525-428d-88ed-828d00bf17eb-utilities\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.489205 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhsn\" (UniqueName: \"kubernetes.io/projected/f779c873-d525-428d-88ed-828d00bf17eb-kube-api-access-8bhsn\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.489288 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f779c873-d525-428d-88ed-828d00bf17eb-catalog-content\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.591161 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f779c873-d525-428d-88ed-828d00bf17eb-utilities\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.591239 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhsn\" (UniqueName: \"kubernetes.io/projected/f779c873-d525-428d-88ed-828d00bf17eb-kube-api-access-8bhsn\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.591284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f779c873-d525-428d-88ed-828d00bf17eb-catalog-content\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.591852 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f779c873-d525-428d-88ed-828d00bf17eb-catalog-content\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.592144 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f779c873-d525-428d-88ed-828d00bf17eb-utilities\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6"
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.595045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfw5z" event={"ID":"523054ef-f8bb-4c7d-9baa-47191e299fcd","Type":"ContainerStarted","Data":"10b3eb493fda827c2b5df4c71ad15012442f5c255c7bce9fa21f0375147ef57f"}
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.596634 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-48d4t"]
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.598096 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerStarted","Data":"dbaac2d048b1bf28c0a544d56094e4190ceeb213bd8b9b6ab6c58e5f9ca98a21"}
Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.598178 4922 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.603747 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.610594 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-48d4t"] Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.618752 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhsn\" (UniqueName: \"kubernetes.io/projected/f779c873-d525-428d-88ed-828d00bf17eb-kube-api-access-8bhsn\") pod \"community-operators-h9zn6\" (UID: \"f779c873-d525-428d-88ed-828d00bf17eb\") " pod="openshift-marketplace/community-operators-h9zn6" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.691911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-utilities\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.691972 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-catalog-content\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.691999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pw6\" (UniqueName: \"kubernetes.io/projected/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-kube-api-access-g2pw6\") pod \"redhat-operators-48d4t\" (UID: 
\"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.724083 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h9zn6" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.793209 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-utilities\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.793436 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-catalog-content\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.793469 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pw6\" (UniqueName: \"kubernetes.io/projected/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-kube-api-access-g2pw6\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.793830 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-utilities\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.794091 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-catalog-content\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.812922 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pw6\" (UniqueName: \"kubernetes.io/projected/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-kube-api-access-g2pw6\") pod \"redhat-operators-48d4t\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") " pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:15 crc kubenswrapper[4922]: I0218 11:42:15.909967 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.136784 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h9zn6"] Feb 18 11:42:16 crc kubenswrapper[4922]: W0218 11:42:16.144190 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf779c873_d525_428d_88ed_828d00bf17eb.slice/crio-e5d3a35c50339e26242c90315c245635e9fe48bbb2338a83a7676b1899b5b36f WatchSource:0}: Error finding container e5d3a35c50339e26242c90315c245635e9fe48bbb2338a83a7676b1899b5b36f: Status 404 returned error can't find the container with id e5d3a35c50339e26242c90315c245635e9fe48bbb2338a83a7676b1899b5b36f Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.284715 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-48d4t"] Feb 18 11:42:16 crc kubenswrapper[4922]: W0218 11:42:16.290380 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1faa074_0925_4c46_b2d7_3d5590f2bfb2.slice/crio-b63935ff693150fab51446b824532dfaf0826366b92ab601cfc7bb7ab4edc1ce WatchSource:0}: Error finding container b63935ff693150fab51446b824532dfaf0826366b92ab601cfc7bb7ab4edc1ce: Status 404 returned error can't find the container with id b63935ff693150fab51446b824532dfaf0826366b92ab601cfc7bb7ab4edc1ce Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.604964 4922 generic.go:334] "Generic (PLEG): container finished" podID="f779c873-d525-428d-88ed-828d00bf17eb" containerID="7c9a3d7b12c16a99991f15769b2726e6f0cfebcdccc145aa918d5e678eb3f45b" exitCode=0 Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.605041 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9zn6" event={"ID":"f779c873-d525-428d-88ed-828d00bf17eb","Type":"ContainerDied","Data":"7c9a3d7b12c16a99991f15769b2726e6f0cfebcdccc145aa918d5e678eb3f45b"} Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.605214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9zn6" event={"ID":"f779c873-d525-428d-88ed-828d00bf17eb","Type":"ContainerStarted","Data":"e5d3a35c50339e26242c90315c245635e9fe48bbb2338a83a7676b1899b5b36f"} Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.609012 4922 generic.go:334] "Generic (PLEG): container finished" podID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerID="dbaac2d048b1bf28c0a544d56094e4190ceeb213bd8b9b6ab6c58e5f9ca98a21" exitCode=0 Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.609071 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerDied","Data":"dbaac2d048b1bf28c0a544d56094e4190ceeb213bd8b9b6ab6c58e5f9ca98a21"} Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.611787 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerID="9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4" exitCode=0 Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.611863 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerDied","Data":"9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4"} Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.611886 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerStarted","Data":"b63935ff693150fab51446b824532dfaf0826366b92ab601cfc7bb7ab4edc1ce"} Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.614520 4922 generic.go:334] "Generic (PLEG): container finished" podID="523054ef-f8bb-4c7d-9baa-47191e299fcd" containerID="10b3eb493fda827c2b5df4c71ad15012442f5c255c7bce9fa21f0375147ef57f" exitCode=0 Feb 18 11:42:16 crc kubenswrapper[4922]: I0218 11:42:16.614543 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfw5z" event={"ID":"523054ef-f8bb-4c7d-9baa-47191e299fcd","Type":"ContainerDied","Data":"10b3eb493fda827c2b5df4c71ad15012442f5c255c7bce9fa21f0375147ef57f"} Feb 18 11:42:17 crc kubenswrapper[4922]: I0218 11:42:17.622267 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfw5z" event={"ID":"523054ef-f8bb-4c7d-9baa-47191e299fcd","Type":"ContainerStarted","Data":"13290870f6ffe0c47237c67e10d8cbc97de5e07ed25e7172602da92c17e8b970"} Feb 18 11:42:17 crc kubenswrapper[4922]: I0218 11:42:17.624607 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9zn6" 
event={"ID":"f779c873-d525-428d-88ed-828d00bf17eb","Type":"ContainerStarted","Data":"c2bca754efca2700d892df8d249dabe432437c2aabf48986910da880395d4d75"} Feb 18 11:42:17 crc kubenswrapper[4922]: I0218 11:42:17.626777 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerStarted","Data":"3c30617879326ec9fb7f35dddf404540d84efb8a598a6b6c0ad31087cd8073f1"} Feb 18 11:42:17 crc kubenswrapper[4922]: I0218 11:42:17.629108 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerStarted","Data":"23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6"} Feb 18 11:42:17 crc kubenswrapper[4922]: I0218 11:42:17.637303 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cfw5z" podStartSLOduration=3.231219018 podStartE2EDuration="5.637287118s" podCreationTimestamp="2026-02-18 11:42:12 +0000 UTC" firstStartedPulling="2026-02-18 11:42:14.583797102 +0000 UTC m=+336.311501182" lastFinishedPulling="2026-02-18 11:42:16.989865202 +0000 UTC m=+338.717569282" observedRunningTime="2026-02-18 11:42:17.635876534 +0000 UTC m=+339.363580614" watchObservedRunningTime="2026-02-18 11:42:17.637287118 +0000 UTC m=+339.364991198" Feb 18 11:42:17 crc kubenswrapper[4922]: I0218 11:42:17.657971 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6bqhb" podStartSLOduration=2.08840754 podStartE2EDuration="4.657952757s" podCreationTimestamp="2026-02-18 11:42:13 +0000 UTC" firstStartedPulling="2026-02-18 11:42:14.585310948 +0000 UTC m=+336.313015018" lastFinishedPulling="2026-02-18 11:42:17.154856155 +0000 UTC m=+338.882560235" observedRunningTime="2026-02-18 11:42:17.655853056 +0000 UTC m=+339.383557146" 
watchObservedRunningTime="2026-02-18 11:42:17.657952757 +0000 UTC m=+339.385656837" Feb 18 11:42:18 crc kubenswrapper[4922]: I0218 11:42:18.647163 4922 generic.go:334] "Generic (PLEG): container finished" podID="f779c873-d525-428d-88ed-828d00bf17eb" containerID="c2bca754efca2700d892df8d249dabe432437c2aabf48986910da880395d4d75" exitCode=0 Feb 18 11:42:18 crc kubenswrapper[4922]: I0218 11:42:18.647236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9zn6" event={"ID":"f779c873-d525-428d-88ed-828d00bf17eb","Type":"ContainerDied","Data":"c2bca754efca2700d892df8d249dabe432437c2aabf48986910da880395d4d75"} Feb 18 11:42:18 crc kubenswrapper[4922]: I0218 11:42:18.650631 4922 generic.go:334] "Generic (PLEG): container finished" podID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerID="23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6" exitCode=0 Feb 18 11:42:18 crc kubenswrapper[4922]: I0218 11:42:18.650685 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerDied","Data":"23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6"} Feb 18 11:42:19 crc kubenswrapper[4922]: I0218 11:42:19.659031 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerStarted","Data":"377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55"} Feb 18 11:42:19 crc kubenswrapper[4922]: I0218 11:42:19.661650 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9zn6" event={"ID":"f779c873-d525-428d-88ed-828d00bf17eb","Type":"ContainerStarted","Data":"88c1b0603c2e2c8ecd6b5d934399d65a9fdbe60917ef93db2d5c5f480e2fae4f"} Feb 18 11:42:19 crc kubenswrapper[4922]: I0218 11:42:19.680201 4922 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-48d4t" podStartSLOduration=2.1458161159999998 podStartE2EDuration="4.680179624s" podCreationTimestamp="2026-02-18 11:42:15 +0000 UTC" firstStartedPulling="2026-02-18 11:42:16.612940255 +0000 UTC m=+338.340644335" lastFinishedPulling="2026-02-18 11:42:19.147303753 +0000 UTC m=+340.875007843" observedRunningTime="2026-02-18 11:42:19.674416065 +0000 UTC m=+341.402120165" watchObservedRunningTime="2026-02-18 11:42:19.680179624 +0000 UTC m=+341.407883704" Feb 18 11:42:19 crc kubenswrapper[4922]: I0218 11:42:19.695637 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h9zn6" podStartSLOduration=2.185069312 podStartE2EDuration="4.695618056s" podCreationTimestamp="2026-02-18 11:42:15 +0000 UTC" firstStartedPulling="2026-02-18 11:42:16.606691304 +0000 UTC m=+338.334395404" lastFinishedPulling="2026-02-18 11:42:19.117240068 +0000 UTC m=+340.844944148" observedRunningTime="2026-02-18 11:42:19.693098235 +0000 UTC m=+341.420802325" watchObservedRunningTime="2026-02-18 11:42:19.695618056 +0000 UTC m=+341.423322136" Feb 18 11:42:22 crc kubenswrapper[4922]: I0218 11:42:22.777127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vs74j" Feb 18 11:42:22 crc kubenswrapper[4922]: I0218 11:42:22.841290 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt2rf"] Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.326419 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cfw5z" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.326695 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cfw5z" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.369291 4922 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cfw5z" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.528208 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6bqhb" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.528279 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6bqhb" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.565699 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6bqhb" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.745237 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6bqhb" Feb 18 11:42:23 crc kubenswrapper[4922]: I0218 11:42:23.746832 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cfw5z" Feb 18 11:42:25 crc kubenswrapper[4922]: I0218 11:42:25.725164 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h9zn6" Feb 18 11:42:25 crc kubenswrapper[4922]: I0218 11:42:25.725740 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h9zn6" Feb 18 11:42:25 crc kubenswrapper[4922]: I0218 11:42:25.789694 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h9zn6" Feb 18 11:42:25 crc kubenswrapper[4922]: I0218 11:42:25.911712 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:25 crc kubenswrapper[4922]: I0218 11:42:25.913078 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-48d4t" 
Feb 18 11:42:25 crc kubenswrapper[4922]: I0218 11:42:25.963536 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:26 crc kubenswrapper[4922]: I0218 11:42:26.768217 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h9zn6" Feb 18 11:42:26 crc kubenswrapper[4922]: I0218 11:42:26.778898 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-48d4t" Feb 18 11:42:39 crc kubenswrapper[4922]: I0218 11:42:39.807948 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:42:39 crc kubenswrapper[4922]: I0218 11:42:39.808639 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:42:47 crc kubenswrapper[4922]: I0218 11:42:47.895548 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" podUID="fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" containerName="registry" containerID="cri-o://9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9" gracePeriod=30 Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.292221 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.347614 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-tls\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.347721 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-installation-pull-secrets\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.347818 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-ca-trust-extracted\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.347894 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-certificates\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.347948 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56jvt\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-kube-api-access-56jvt\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.348081 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-bound-sa-token\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.348117 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-trusted-ca\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.348300 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\" (UID: \"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0\") " Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.350140 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.350901 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.357092 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.359693 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.367054 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.370330 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.376709 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-kube-api-access-56jvt" (OuterVolumeSpecName: "kube-api-access-56jvt") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "kube-api-access-56jvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.383454 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" (UID: "fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.449957 4922 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.450004 4922 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.450020 4922 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.450032 4922 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.450043 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56jvt\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-kube-api-access-56jvt\") on node \"crc\" DevicePath \"\""
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.450054 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.450064 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.837554 4922 generic.go:334] "Generic (PLEG): container finished" podID="fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" containerID="9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9" exitCode=0
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.837613 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" event={"ID":"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0","Type":"ContainerDied","Data":"9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9"}
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.837672 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf"
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.837696 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wt2rf" event={"ID":"fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0","Type":"ContainerDied","Data":"ea1a8d4dd7f49c86b94455df1de54cf958a56e87072aa59f4711d51402743ec5"}
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.837737 4922 scope.go:117] "RemoveContainer" containerID="9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9"
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.864911 4922 scope.go:117] "RemoveContainer" containerID="9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9"
Feb 18 11:42:48 crc kubenswrapper[4922]: E0218 11:42:48.866227 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9\": container with ID starting with 9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9 not found: ID does not exist" containerID="9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9"
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.866313 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9"} err="failed to get container status \"9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9\": rpc error: code = NotFound desc = could not find container \"9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9\": container with ID starting with 9bc6cc301e873229c1127137e22f6a9b28ccb1bc2ceb7bd9d965728c3ed9f3a9 not found: ID does not exist"
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.890116 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt2rf"]
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.896840 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wt2rf"]
Feb 18 11:42:48 crc kubenswrapper[4922]: I0218 11:42:48.983763 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" path="/var/lib/kubelet/pods/fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0/volumes"
Feb 18 11:43:04 crc kubenswrapper[4922]: I0218 11:43:03.775598 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-x69t8" podUID="a768634b-1586-4ba2-9a05-6a88f5befea1" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:43:09 crc kubenswrapper[4922]: I0218 11:43:09.807251 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:43:09 crc kubenswrapper[4922]: I0218 11:43:09.808021 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:43:39 crc kubenswrapper[4922]: I0218 11:43:39.807215 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:43:39 crc kubenswrapper[4922]: I0218 11:43:39.807799 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:43:39 crc kubenswrapper[4922]: I0218 11:43:39.807846 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 11:43:39 crc kubenswrapper[4922]: I0218 11:43:39.808329 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1a0cd1a4059d80457c2ea470bb511d4a674a1cbfe4b38356d0636a94629e11f"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 11:43:39 crc kubenswrapper[4922]: I0218 11:43:39.808402 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://f1a0cd1a4059d80457c2ea470bb511d4a674a1cbfe4b38356d0636a94629e11f" gracePeriod=600
Feb 18 11:43:40 crc kubenswrapper[4922]: I0218 11:43:40.247267 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="f1a0cd1a4059d80457c2ea470bb511d4a674a1cbfe4b38356d0636a94629e11f" exitCode=0
Feb 18 11:43:40 crc kubenswrapper[4922]: I0218 11:43:40.247386 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"f1a0cd1a4059d80457c2ea470bb511d4a674a1cbfe4b38356d0636a94629e11f"}
Feb 18 11:43:40 crc kubenswrapper[4922]: I0218 11:43:40.247653 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"02db91f6fc3d787614fdd835995da44a7dcbace8afdf71101bf73a1aefb8d53b"}
Feb 18 11:43:40 crc kubenswrapper[4922]: I0218 11:43:40.247682 4922 scope.go:117] "RemoveContainer" containerID="653a3be8615ffc467d4be70b77ab62c837c24df920515a9965ec20ce2941d86b"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.178331 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"]
Feb 18 11:45:00 crc kubenswrapper[4922]: E0218 11:45:00.179086 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" containerName="registry"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.179100 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" containerName="registry"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.179211 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9f6f1e-d5ab-4de4-b8b4-ee14f742f2e0" containerName="registry"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.179632 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.181620 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.181621 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.187128 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"]
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.291289 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhc4\" (UniqueName: \"kubernetes.io/projected/74bd299a-42ac-4c5a-93ff-5809da5517b3-kube-api-access-tzhc4\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.291842 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74bd299a-42ac-4c5a-93ff-5809da5517b3-secret-volume\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.291933 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74bd299a-42ac-4c5a-93ff-5809da5517b3-config-volume\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.393583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74bd299a-42ac-4c5a-93ff-5809da5517b3-secret-volume\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.393644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74bd299a-42ac-4c5a-93ff-5809da5517b3-config-volume\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.393706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhc4\" (UniqueName: \"kubernetes.io/projected/74bd299a-42ac-4c5a-93ff-5809da5517b3-kube-api-access-tzhc4\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.394854 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74bd299a-42ac-4c5a-93ff-5809da5517b3-config-volume\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.408927 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74bd299a-42ac-4c5a-93ff-5809da5517b3-secret-volume\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.423923 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhc4\" (UniqueName: \"kubernetes.io/projected/74bd299a-42ac-4c5a-93ff-5809da5517b3-kube-api-access-tzhc4\") pod \"collect-profiles-29523585-sckv4\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.500235 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.699074 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"]
Feb 18 11:45:00 crc kubenswrapper[4922]: I0218 11:45:00.790091 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4" event={"ID":"74bd299a-42ac-4c5a-93ff-5809da5517b3","Type":"ContainerStarted","Data":"c7c0dc7121ed5ba5ab0dbf0029cc1d1f88e3c9c5a36c6287bbf69b39c0f6db47"}
Feb 18 11:45:01 crc kubenswrapper[4922]: I0218 11:45:01.796301 4922 generic.go:334] "Generic (PLEG): container finished" podID="74bd299a-42ac-4c5a-93ff-5809da5517b3" containerID="1e57799f76ef61ec42eb4d7506cd5272291d57133dccf113ac6a6ed7f96b16b6" exitCode=0
Feb 18 11:45:01 crc kubenswrapper[4922]: I0218 11:45:01.796563 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4" event={"ID":"74bd299a-42ac-4c5a-93ff-5809da5517b3","Type":"ContainerDied","Data":"1e57799f76ef61ec42eb4d7506cd5272291d57133dccf113ac6a6ed7f96b16b6"}
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.304550 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.506176 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74bd299a-42ac-4c5a-93ff-5809da5517b3-config-volume\") pod \"74bd299a-42ac-4c5a-93ff-5809da5517b3\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") "
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.506289 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74bd299a-42ac-4c5a-93ff-5809da5517b3-secret-volume\") pod \"74bd299a-42ac-4c5a-93ff-5809da5517b3\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") "
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.506547 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzhc4\" (UniqueName: \"kubernetes.io/projected/74bd299a-42ac-4c5a-93ff-5809da5517b3-kube-api-access-tzhc4\") pod \"74bd299a-42ac-4c5a-93ff-5809da5517b3\" (UID: \"74bd299a-42ac-4c5a-93ff-5809da5517b3\") "
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.507023 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bd299a-42ac-4c5a-93ff-5809da5517b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "74bd299a-42ac-4c5a-93ff-5809da5517b3" (UID: "74bd299a-42ac-4c5a-93ff-5809da5517b3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.511891 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74bd299a-42ac-4c5a-93ff-5809da5517b3-kube-api-access-tzhc4" (OuterVolumeSpecName: "kube-api-access-tzhc4") pod "74bd299a-42ac-4c5a-93ff-5809da5517b3" (UID: "74bd299a-42ac-4c5a-93ff-5809da5517b3"). InnerVolumeSpecName "kube-api-access-tzhc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.512779 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74bd299a-42ac-4c5a-93ff-5809da5517b3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74bd299a-42ac-4c5a-93ff-5809da5517b3" (UID: "74bd299a-42ac-4c5a-93ff-5809da5517b3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.607898 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzhc4\" (UniqueName: \"kubernetes.io/projected/74bd299a-42ac-4c5a-93ff-5809da5517b3-kube-api-access-tzhc4\") on node \"crc\" DevicePath \"\""
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.608143 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74bd299a-42ac-4c5a-93ff-5809da5517b3-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 11:45:03 crc kubenswrapper[4922]: I0218 11:45:03.608235 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74bd299a-42ac-4c5a-93ff-5809da5517b3-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 11:45:04 crc kubenswrapper[4922]: I0218 11:45:04.112802 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4" event={"ID":"74bd299a-42ac-4c5a-93ff-5809da5517b3","Type":"ContainerDied","Data":"c7c0dc7121ed5ba5ab0dbf0029cc1d1f88e3c9c5a36c6287bbf69b39c0f6db47"}
Feb 18 11:45:04 crc kubenswrapper[4922]: I0218 11:45:04.113122 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7c0dc7121ed5ba5ab0dbf0029cc1d1f88e3c9c5a36c6287bbf69b39c0f6db47"
Feb 18 11:45:04 crc kubenswrapper[4922]: I0218 11:45:04.112859 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"
Feb 18 11:46:09 crc kubenswrapper[4922]: I0218 11:46:09.808157 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:46:09 crc kubenswrapper[4922]: I0218 11:46:09.808918 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:46:39 crc kubenswrapper[4922]: I0218 11:46:39.254852 4922 scope.go:117] "RemoveContainer" containerID="bebc46c7f9271b765da6413fad509e6ec5685b8951728d70348b293df49ce847"
Feb 18 11:46:39 crc kubenswrapper[4922]: I0218 11:46:39.807657 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:46:39 crc kubenswrapper[4922]: I0218 11:46:39.808123 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.266474 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"]
Feb 18 11:47:09 crc kubenswrapper[4922]: E0218 11:47:09.267241 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bd299a-42ac-4c5a-93ff-5809da5517b3" containerName="collect-profiles"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.267257 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bd299a-42ac-4c5a-93ff-5809da5517b3" containerName="collect-profiles"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.267400 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="74bd299a-42ac-4c5a-93ff-5809da5517b3" containerName="collect-profiles"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.267834 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.272474 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-m7rrs"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.272663 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.272959 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.282283 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"]
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.288094 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-tq4pt"]
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.288863 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tq4pt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.293590 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-d6vgr"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.302867 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tq4pt"]
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.303916 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttg7v\" (UniqueName: \"kubernetes.io/projected/906da7e7-ffe0-496f-bfb4-a76c2c14589e-kube-api-access-ttg7v\") pod \"cert-manager-858654f9db-tq4pt\" (UID: \"906da7e7-ffe0-496f-bfb4-a76c2c14589e\") " pod="cert-manager/cert-manager-858654f9db-tq4pt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.303985 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzkj\" (UniqueName: \"kubernetes.io/projected/c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1-kube-api-access-gzzkj\") pod \"cert-manager-cainjector-cf98fcc89-wlvsw\" (UID: \"c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.320306 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vvgzd"]
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.320973 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.322930 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l4z5h"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.328277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vvgzd"]
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.405972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttg7v\" (UniqueName: \"kubernetes.io/projected/906da7e7-ffe0-496f-bfb4-a76c2c14589e-kube-api-access-ttg7v\") pod \"cert-manager-858654f9db-tq4pt\" (UID: \"906da7e7-ffe0-496f-bfb4-a76c2c14589e\") " pod="cert-manager/cert-manager-858654f9db-tq4pt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.406082 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h6r4\" (UniqueName: \"kubernetes.io/projected/04a66d89-6415-45c5-b87b-b3730678eac4-kube-api-access-6h6r4\") pod \"cert-manager-webhook-687f57d79b-vvgzd\" (UID: \"04a66d89-6415-45c5-b87b-b3730678eac4\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.406127 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzkj\" (UniqueName: \"kubernetes.io/projected/c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1-kube-api-access-gzzkj\") pod \"cert-manager-cainjector-cf98fcc89-wlvsw\" (UID: \"c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.424999 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzkj\" (UniqueName: \"kubernetes.io/projected/c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1-kube-api-access-gzzkj\") pod \"cert-manager-cainjector-cf98fcc89-wlvsw\" (UID: \"c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.425340 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttg7v\" (UniqueName: \"kubernetes.io/projected/906da7e7-ffe0-496f-bfb4-a76c2c14589e-kube-api-access-ttg7v\") pod \"cert-manager-858654f9db-tq4pt\" (UID: \"906da7e7-ffe0-496f-bfb4-a76c2c14589e\") " pod="cert-manager/cert-manager-858654f9db-tq4pt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.507553 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h6r4\" (UniqueName: \"kubernetes.io/projected/04a66d89-6415-45c5-b87b-b3730678eac4-kube-api-access-6h6r4\") pod \"cert-manager-webhook-687f57d79b-vvgzd\" (UID: \"04a66d89-6415-45c5-b87b-b3730678eac4\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.524081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6r4\" (UniqueName: \"kubernetes.io/projected/04a66d89-6415-45c5-b87b-b3730678eac4-kube-api-access-6h6r4\") pod \"cert-manager-webhook-687f57d79b-vvgzd\" (UID: \"04a66d89-6415-45c5-b87b-b3730678eac4\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.589256 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.608522 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tq4pt"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.637455 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.807227 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.807555 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.807603 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.808166 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02db91f6fc3d787614fdd835995da44a7dcbace8afdf71101bf73a1aefb8d53b"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.808229 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://02db91f6fc3d787614fdd835995da44a7dcbace8afdf71101bf73a1aefb8d53b" gracePeriod=600
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.823642 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw"]
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.851354 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 11:47:09 crc kubenswrapper[4922]: I0218 11:47:09.919807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw" event={"ID":"c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1","Type":"ContainerStarted","Data":"09a208642ab1c9ffa1bb4ad1898807211597d5a3837c02d54769debc7d48b28b"}
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.054792 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tq4pt"]
Feb 18 11:47:10 crc kubenswrapper[4922]: W0218 11:47:10.061684 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod906da7e7_ffe0_496f_bfb4_a76c2c14589e.slice/crio-1a4c95f1fb5ea5125698f3168e43d4b01d01d12d637c5d933a214c1de67fcc3c WatchSource:0}: Error finding container 1a4c95f1fb5ea5125698f3168e43d4b01d01d12d637c5d933a214c1de67fcc3c: Status 404 returned error can't find the container with id 1a4c95f1fb5ea5125698f3168e43d4b01d01d12d637c5d933a214c1de67fcc3c
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.071528 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vvgzd"]
Feb 18 11:47:10 crc kubenswrapper[4922]: W0218 11:47:10.077013 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04a66d89_6415_45c5_b87b_b3730678eac4.slice/crio-fc2dab564f4f064235f842294b88f1bf319022b8f4f328947753588d3d3d7e2b WatchSource:0}: Error finding container fc2dab564f4f064235f842294b88f1bf319022b8f4f328947753588d3d3d7e2b: Status 404 returned error can't find the container with id fc2dab564f4f064235f842294b88f1bf319022b8f4f328947753588d3d3d7e2b
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.930475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tq4pt" event={"ID":"906da7e7-ffe0-496f-bfb4-a76c2c14589e","Type":"ContainerStarted","Data":"1a4c95f1fb5ea5125698f3168e43d4b01d01d12d637c5d933a214c1de67fcc3c"}
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.932365 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd" event={"ID":"04a66d89-6415-45c5-b87b-b3730678eac4","Type":"ContainerStarted","Data":"fc2dab564f4f064235f842294b88f1bf319022b8f4f328947753588d3d3d7e2b"}
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.934823 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="02db91f6fc3d787614fdd835995da44a7dcbace8afdf71101bf73a1aefb8d53b" exitCode=0
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.934853 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"02db91f6fc3d787614fdd835995da44a7dcbace8afdf71101bf73a1aefb8d53b"}
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.934870 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"0ea4c69ba94e3a69c2a9d6932ead886d1aa8f5af4ec72d79e294ae3c3d8f54dd"}
Feb 18 11:47:10 crc kubenswrapper[4922]: I0218 11:47:10.934888 4922 scope.go:117] "RemoveContainer" containerID="f1a0cd1a4059d80457c2ea470bb511d4a674a1cbfe4b38356d0636a94629e11f"
Feb 18 11:47:13 crc kubenswrapper[4922]: I0218 11:47:13.953647 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tq4pt" event={"ID":"906da7e7-ffe0-496f-bfb4-a76c2c14589e","Type":"ContainerStarted","Data":"ae48bee2f8aea520c7c10505f58d8792b19e7e79ab4e2cd633bd5f8662fe7286"}
Feb 18 11:47:13 crc kubenswrapper[4922]: I0218 11:47:13.955338 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd" event={"ID":"04a66d89-6415-45c5-b87b-b3730678eac4","Type":"ContainerStarted","Data":"083bfbb79085eaf04357a5635ba7f41cd159a66de64b81163224bf6a927bd2d5"}
Feb 18 11:47:13 crc kubenswrapper[4922]: I0218 11:47:13.955434 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd"
Feb 18 11:47:13 crc kubenswrapper[4922]: I0218 11:47:13.956835 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw" event={"ID":"c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1","Type":"ContainerStarted","Data":"58199038b740d8e74d83d11707f647becfb04c24df12d6df6c4138b5211ba652"}
Feb 18 11:47:13 crc kubenswrapper[4922]: I0218 11:47:13.975324 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-tq4pt" podStartSLOduration=1.9179215410000001 podStartE2EDuration="4.975291513s" podCreationTimestamp="2026-02-18 11:47:09 +0000 UTC" firstStartedPulling="2026-02-18 11:47:10.062864617 +0000 UTC m=+631.790568697" lastFinishedPulling="2026-02-18 11:47:13.120234579 +0000 UTC m=+634.847938669" observedRunningTime="2026-02-18 11:47:13.970854073 +0000 UTC m=+635.698558163" watchObservedRunningTime="2026-02-18 11:47:13.975291513 +0000 UTC m=+635.702995613"
Feb 18 11:47:14 crc kubenswrapper[4922]: I0218 11:47:14.001333 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wlvsw" podStartSLOduration=3.0104365299999998 podStartE2EDuration="5.001311667s" podCreationTimestamp="2026-02-18 11:47:09 +0000 UTC" firstStartedPulling="2026-02-18 11:47:09.851092405 +0000 UTC m=+631.578796485" lastFinishedPulling="2026-02-18 11:47:11.841967552 +0000 UTC m=+633.569671622" observedRunningTime="2026-02-18 11:47:14.000464906 +0000 UTC m=+635.728169016" watchObservedRunningTime="2026-02-18 11:47:14.001311667 +0000 UTC m=+635.729015777"
Feb 18 11:47:14 crc kubenswrapper[4922]: I0218 11:47:14.019314 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd" podStartSLOduration=2.039468129 podStartE2EDuration="5.019296052s" podCreationTimestamp="2026-02-18 11:47:09 +0000 UTC" firstStartedPulling="2026-02-18 11:47:10.080410791 +0000 UTC m=+631.808114881" lastFinishedPulling="2026-02-18 11:47:13.060238724 +0000 UTC m=+634.787942804" observedRunningTime="2026-02-18 11:47:14.016183405 +0000 UTC m=+635.743887495" watchObservedRunningTime="2026-02-18 11:47:14.019296052 +0000 UTC m=+635.747000142"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.364629 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wg4r5"]
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366134 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-controller" containerID="cri-o://9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366343 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="northd" containerID="cri-o://90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366510 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366582 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-node" containerID="cri-o://3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366640 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-acl-logging" containerID="cri-o://fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366756 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="sbdb" containerID="cri-o://e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.366195 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="nbdb" containerID="cri-o://11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" gracePeriod=30
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.413853 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller"
containerID="cri-o://d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" gracePeriod=30 Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.640871 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vvgzd" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.707775 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/3.log" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.709769 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovn-acl-logging/0.log" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.710320 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovn-controller/0.log" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.711402 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766351 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t47sv"] Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766567 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766581 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766591 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="sbdb" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766597 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="sbdb" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766607 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kubecfg-setup" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766615 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kubecfg-setup" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766623 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766629 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766639 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" 
containerName="ovn-acl-logging" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766645 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-acl-logging" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766658 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766664 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766671 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766677 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766684 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-node" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766690 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-node" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766696 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="nbdb" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766702 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="nbdb" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766709 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" 
containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766715 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.766723 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="northd" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766729 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="northd" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766834 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766843 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766850 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="northd" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766856 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766863 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766871 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="kube-rbac-proxy-node" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766879 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="sbdb" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766884 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766891 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="nbdb" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766899 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766909 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.766918 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovn-acl-logging" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.767004 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.767012 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: E0218 11:47:19.767023 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.767031 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerName="ovnkube-controller" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.768749 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.769625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-log-socket\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.769901 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgxc8\" (UniqueName: \"kubernetes.io/projected/7b826f3a-fb9a-4cf2-a4de-c6a394001583-kube-api-access-wgxc8\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770016 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-cni-bin\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770125 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-slash\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770215 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-var-lib-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: 
\"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770293 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-run-ovn-kubernetes\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770394 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-kubelet\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770479 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-ovn\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-systemd\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770656 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovnkube-config\") pod 
\"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770742 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-env-overrides\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770854 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-systemd-units\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.770942 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-etc-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.771040 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-node-log\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.771130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.771259 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-run-netns\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.772090 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovn-node-metrics-cert\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.772240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.772668 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovnkube-script-lib\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.772804 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-cni-netd\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.873589 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-node-log\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.873847 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-netd\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.873923 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-bin\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.873986 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874052 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-etc-openvswitch\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874115 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-netns\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874176 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-systemd-units\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874239 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-var-lib-openvswitch\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874304 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-script-lib\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874392 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-openvswitch\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc 
kubenswrapper[4922]: I0218 11:47:19.874468 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-ovn-kubernetes\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874532 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-config\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874596 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-ovn\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26p2s\" (UniqueName: \"kubernetes.io/projected/653a41bb-bb1d-421c-a92b-7f2811d95edf-kube-api-access-26p2s\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-env-overrides\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-log-socket\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874854 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-slash\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874922 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-kubelet\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.874984 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-systemd\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.875049 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovn-node-metrics-cert\") pod \"653a41bb-bb1d-421c-a92b-7f2811d95edf\" (UID: \"653a41bb-bb1d-421c-a92b-7f2811d95edf\") " Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876042 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-node-log\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 
11:47:19.876227 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-node-log" (OuterVolumeSpecName: "node-log") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876279 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876271 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-slash" (OuterVolumeSpecName: "host-slash") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876293 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876298 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876306 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876315 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876322 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876329 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876349 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876381 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-log-socket" (OuterVolumeSpecName: "log-socket") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876433 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876718 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876729 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876758 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876776 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.876748 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-node-log\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877380 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-run-netns\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877744 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovn-node-metrics-cert\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovnkube-script-lib\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-cni-netd\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877875 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-log-socket\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877921 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgxc8\" (UniqueName: \"kubernetes.io/projected/7b826f3a-fb9a-4cf2-a4de-c6a394001583-kube-api-access-wgxc8\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.877945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-cni-bin\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878028 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-slash\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878054 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-var-lib-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-run-ovn-kubernetes\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878110 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-kubelet\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878129 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-ovn\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878166 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-systemd\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878205 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovnkube-config\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-env-overrides\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878252 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-systemd-units\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-etc-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878352 4922 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-node-log\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878385 4922 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878399 4922 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878413 4922 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878426 4922 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878440 4922 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878453 4922 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878464 4922 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878475 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878486 4922 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-slash\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878585 4922 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-kubelet\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878602 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-systemd\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-etc-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878614 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-var-lib-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878625 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-run-ovn-kubernetes\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878597 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-log-socket\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-ovn\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878591 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-systemd-units\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878619 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-run-netns\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878636 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-cni-bin\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-run-openvswitch\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.878994 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b826f3a-fb9a-4cf2-a4de-c6a394001583-host-cni-netd\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879020 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879036 4922 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879047 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/653a41bb-bb1d-421c-a92b-7f2811d95edf-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879057 4922 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-log-socket\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879067 4922 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-slash\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879077 4922 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879243 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-env-overrides\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879305 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovnkube-script-lib\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.879573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovnkube-config\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.881265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653a41bb-bb1d-421c-a92b-7f2811d95edf-kube-api-access-26p2s" (OuterVolumeSpecName: "kube-api-access-26p2s") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "kube-api-access-26p2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.881388 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.882055 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b826f3a-fb9a-4cf2-a4de-c6a394001583-ovn-node-metrics-cert\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.888496 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "653a41bb-bb1d-421c-a92b-7f2811d95edf" (UID: "653a41bb-bb1d-421c-a92b-7f2811d95edf"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.893321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgxc8\" (UniqueName: \"kubernetes.io/projected/7b826f3a-fb9a-4cf2-a4de-c6a394001583-kube-api-access-wgxc8\") pod \"ovnkube-node-t47sv\" (UID: \"7b826f3a-fb9a-4cf2-a4de-c6a394001583\") " pod="openshift-ovn-kubernetes/ovnkube-node-t47sv"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.980022 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26p2s\" (UniqueName: \"kubernetes.io/projected/653a41bb-bb1d-421c-a92b-7f2811d95edf-kube-api-access-26p2s\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.980071 4922 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/653a41bb-bb1d-421c-a92b-7f2811d95edf-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.980091 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/653a41bb-bb1d-421c-a92b-7f2811d95edf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.994229 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovnkube-controller/3.log"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.996711 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovn-acl-logging/0.log"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.997461 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wg4r5_653a41bb-bb1d-421c-a92b-7f2811d95edf/ovn-controller/0.log"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.997952 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" exitCode=0
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.997983 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" exitCode=0
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.997994 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" exitCode=0
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998004 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" exitCode=0
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998013 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" exitCode=0
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998022 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" exitCode=0
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998030 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" exitCode=143
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998042 4922 generic.go:334] "Generic (PLEG): container finished" podID="653a41bb-bb1d-421c-a92b-7f2811d95edf" containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" exitCode=143
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998049 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998141 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998164 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998183 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998204 4922 scope.go:117] "RemoveContainer" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998237 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998262 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998283 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998297 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998381 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998397 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998408 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998419 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998461 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998474 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998491 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998509 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998929 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998943 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998954 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.998996 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999008 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999019 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999030 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999040 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999051 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"}
Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999285 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container"
containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999298 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999309 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999321 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999332 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999344 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999355 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999397 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999408 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999418 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999436 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wg4r5" event={"ID":"653a41bb-bb1d-421c-a92b-7f2811d95edf","Type":"ContainerDied","Data":"925e7e67a4fbe78eeed080fb6248ee3ea896e9f6ac32d11416ce15ab5e43d0fb"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999452 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999465 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999476 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999487 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999498 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999509 4922 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999524 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999535 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999546 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} Feb 18 11:47:19 crc kubenswrapper[4922]: I0218 11:47:19.999556 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.001332 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/2.log" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.002089 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/1.log" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.002178 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b4595ac-c521-4ada-950d-e1b01cdff99b" containerID="7270cd2a97c0ebd334ca9515a9ea50d1714e52da4915c1ade982166b94d940c9" exitCode=2 Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.002250 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerDied","Data":"7270cd2a97c0ebd334ca9515a9ea50d1714e52da4915c1ade982166b94d940c9"} Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.002307 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765"} Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.003131 4922 scope.go:117] "RemoveContainer" containerID="7270cd2a97c0ebd334ca9515a9ea50d1714e52da4915c1ade982166b94d940c9" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.003519 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-c9xzd_openshift-multus(9b4595ac-c521-4ada-950d-e1b01cdff99b)\"" pod="openshift-multus/multus-c9xzd" podUID="9b4595ac-c521-4ada-950d-e1b01cdff99b" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.016433 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.037644 4922 scope.go:117] "RemoveContainer" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.059154 4922 scope.go:117] "RemoveContainer" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.061484 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wg4r5"] Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.068601 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wg4r5"] Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.080605 4922 scope.go:117] "RemoveContainer" 
containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.087882 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.094499 4922 scope.go:117] "RemoveContainer" containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.121538 4922 scope.go:117] "RemoveContainer" containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.141313 4922 scope.go:117] "RemoveContainer" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.161102 4922 scope.go:117] "RemoveContainer" containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.180917 4922 scope.go:117] "RemoveContainer" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.196820 4922 scope.go:117] "RemoveContainer" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.197271 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": container with ID starting with d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3 not found: ID does not exist" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.197347 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} err="failed to get container status \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": rpc error: code = NotFound desc = could not find container \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": container with ID starting with d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.197415 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.197833 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": container with ID starting with 1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081 not found: ID does not exist" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.197887 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} err="failed to get container status \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": rpc error: code = NotFound desc = could not find container \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": container with ID starting with 1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.197925 4922 scope.go:117] "RemoveContainer" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.198180 4922 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": container with ID starting with e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c not found: ID does not exist" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.198209 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} err="failed to get container status \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": rpc error: code = NotFound desc = could not find container \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": container with ID starting with e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.198229 4922 scope.go:117] "RemoveContainer" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.198642 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": container with ID starting with 11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6 not found: ID does not exist" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.198675 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} err="failed to get container status \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": rpc error: code = NotFound desc = could not find container 
\"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": container with ID starting with 11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.198694 4922 scope.go:117] "RemoveContainer" containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.198932 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": container with ID starting with 90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16 not found: ID does not exist" containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.198964 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} err="failed to get container status \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": rpc error: code = NotFound desc = could not find container \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": container with ID starting with 90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.198985 4922 scope.go:117] "RemoveContainer" containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.199380 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": container with ID starting with eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b not found: ID does not exist" 
containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.199412 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} err="failed to get container status \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": rpc error: code = NotFound desc = could not find container \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": container with ID starting with eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.199432 4922 scope.go:117] "RemoveContainer" containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.199858 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": container with ID starting with 3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844 not found: ID does not exist" containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.199892 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} err="failed to get container status \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": rpc error: code = NotFound desc = could not find container \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": container with ID starting with 3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.199914 4922 scope.go:117] 
"RemoveContainer" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.200126 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": container with ID starting with fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7 not found: ID does not exist" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.200156 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} err="failed to get container status \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": rpc error: code = NotFound desc = could not find container \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": container with ID starting with fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.200176 4922 scope.go:117] "RemoveContainer" containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.200418 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": container with ID starting with 9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e not found: ID does not exist" containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.200454 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} err="failed to get container status \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": rpc error: code = NotFound desc = could not find container \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": container with ID starting with 9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.200508 4922 scope.go:117] "RemoveContainer" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" Feb 18 11:47:20 crc kubenswrapper[4922]: E0218 11:47:20.200776 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": container with ID starting with 64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794 not found: ID does not exist" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.200801 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} err="failed to get container status \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": rpc error: code = NotFound desc = could not find container \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": container with ID starting with 64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.200817 4922 scope.go:117] "RemoveContainer" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201082 4922 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} err="failed to get container status \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": rpc error: code = NotFound desc = could not find container \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": container with ID starting with d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201104 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201323 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} err="failed to get container status \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": rpc error: code = NotFound desc = could not find container \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": container with ID starting with 1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201408 4922 scope.go:117] "RemoveContainer" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201642 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} err="failed to get container status \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": rpc error: code = NotFound desc = could not find container \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": container with ID starting with e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c not 
found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201672 4922 scope.go:117] "RemoveContainer" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201883 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} err="failed to get container status \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": rpc error: code = NotFound desc = could not find container \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": container with ID starting with 11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.201907 4922 scope.go:117] "RemoveContainer" containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.202282 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} err="failed to get container status \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": rpc error: code = NotFound desc = could not find container \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": container with ID starting with 90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.202306 4922 scope.go:117] "RemoveContainer" containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.202594 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} err="failed to get 
container status \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": rpc error: code = NotFound desc = could not find container \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": container with ID starting with eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.202621 4922 scope.go:117] "RemoveContainer" containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.202821 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} err="failed to get container status \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": rpc error: code = NotFound desc = could not find container \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": container with ID starting with 3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.202844 4922 scope.go:117] "RemoveContainer" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203060 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} err="failed to get container status \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": rpc error: code = NotFound desc = could not find container \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": container with ID starting with fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203080 4922 scope.go:117] "RemoveContainer" 
containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203311 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} err="failed to get container status \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": rpc error: code = NotFound desc = could not find container \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": container with ID starting with 9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203336 4922 scope.go:117] "RemoveContainer" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203591 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} err="failed to get container status \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": rpc error: code = NotFound desc = could not find container \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": container with ID starting with 64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203612 4922 scope.go:117] "RemoveContainer" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203843 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} err="failed to get container status \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": rpc error: code = NotFound desc = could 
not find container \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": container with ID starting with d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.203869 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.204171 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} err="failed to get container status \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": rpc error: code = NotFound desc = could not find container \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": container with ID starting with 1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.204197 4922 scope.go:117] "RemoveContainer" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.204447 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} err="failed to get container status \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": rpc error: code = NotFound desc = could not find container \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": container with ID starting with e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.204472 4922 scope.go:117] "RemoveContainer" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 
11:47:20.204838 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} err="failed to get container status \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": rpc error: code = NotFound desc = could not find container \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": container with ID starting with 11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.204861 4922 scope.go:117] "RemoveContainer" containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205084 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} err="failed to get container status \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": rpc error: code = NotFound desc = could not find container \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": container with ID starting with 90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205110 4922 scope.go:117] "RemoveContainer" containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205385 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} err="failed to get container status \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": rpc error: code = NotFound desc = could not find container \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": container with ID starting with 
eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205417 4922 scope.go:117] "RemoveContainer" containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205652 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} err="failed to get container status \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": rpc error: code = NotFound desc = could not find container \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": container with ID starting with 3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205683 4922 scope.go:117] "RemoveContainer" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205918 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} err="failed to get container status \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": rpc error: code = NotFound desc = could not find container \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": container with ID starting with fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.205941 4922 scope.go:117] "RemoveContainer" containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.206172 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} err="failed to get container status \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": rpc error: code = NotFound desc = could not find container \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": container with ID starting with 9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.206193 4922 scope.go:117] "RemoveContainer" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.206539 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} err="failed to get container status \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": rpc error: code = NotFound desc = could not find container \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": container with ID starting with 64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.206560 4922 scope.go:117] "RemoveContainer" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.206785 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} err="failed to get container status \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": rpc error: code = NotFound desc = could not find container \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": container with ID starting with d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3 not found: ID does not 
exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.206805 4922 scope.go:117] "RemoveContainer" containerID="1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207073 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081"} err="failed to get container status \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": rpc error: code = NotFound desc = could not find container \"1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081\": container with ID starting with 1d1aca901c6bb735f04a76a87f7a05d6c549b8f936a07fb1917ece2b9fdd0081 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207093 4922 scope.go:117] "RemoveContainer" containerID="e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207327 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c"} err="failed to get container status \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": rpc error: code = NotFound desc = could not find container \"e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c\": container with ID starting with e99aaa236f5140d0dc7d57feac05d3ed348ca3e3c21662747fc2d9fb6f6d524c not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207349 4922 scope.go:117] "RemoveContainer" containerID="11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207631 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6"} err="failed to get container status 
\"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": rpc error: code = NotFound desc = could not find container \"11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6\": container with ID starting with 11cc7edb64eebf72aeab109cbba4dfecbb601f3e9f585237b8e57b66896ec0f6 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207654 4922 scope.go:117] "RemoveContainer" containerID="90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207915 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16"} err="failed to get container status \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": rpc error: code = NotFound desc = could not find container \"90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16\": container with ID starting with 90386bc683dad72409b376571ec9798e81dde30ffde519bc4a97952520449c16 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.207936 4922 scope.go:117] "RemoveContainer" containerID="eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208200 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b"} err="failed to get container status \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": rpc error: code = NotFound desc = could not find container \"eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b\": container with ID starting with eec020def8c057658795c1b83804239fb6285e4bbcd3ba3a1fe05368365fca3b not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208221 4922 scope.go:117] "RemoveContainer" 
containerID="3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208522 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844"} err="failed to get container status \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": rpc error: code = NotFound desc = could not find container \"3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844\": container with ID starting with 3fe29d47bb107a4073f06d833a5f436aa6746bf2c47dd7c820f9ac2e4308a844 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208543 4922 scope.go:117] "RemoveContainer" containerID="fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208759 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7"} err="failed to get container status \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": rpc error: code = NotFound desc = could not find container \"fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7\": container with ID starting with fcda158beee470e32bf198a44c8cc4e4433345e4000dcefdd16252c0aa3529e7 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208782 4922 scope.go:117] "RemoveContainer" containerID="9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.208985 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e"} err="failed to get container status \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": rpc error: code = NotFound desc = could 
not find container \"9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e\": container with ID starting with 9cbef294a9d462e0843258405c7892152327a707bc2577f06eedfb7ecf13e20e not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.209005 4922 scope.go:117] "RemoveContainer" containerID="64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.209234 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794"} err="failed to get container status \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": rpc error: code = NotFound desc = could not find container \"64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794\": container with ID starting with 64c67cddfeb39e5545606a34a3bd8080de4a210b966a185824b2d374b880c794 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.209260 4922 scope.go:117] "RemoveContainer" containerID="d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.209495 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3"} err="failed to get container status \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": rpc error: code = NotFound desc = could not find container \"d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3\": container with ID starting with d40ac6f2baab7bfed2de70b92b02e9fd3903b69815200ff68c4655a0de9763f3 not found: ID does not exist" Feb 18 11:47:20 crc kubenswrapper[4922]: I0218 11:47:20.983452 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653a41bb-bb1d-421c-a92b-7f2811d95edf" 
path="/var/lib/kubelet/pods/653a41bb-bb1d-421c-a92b-7f2811d95edf/volumes" Feb 18 11:47:21 crc kubenswrapper[4922]: I0218 11:47:21.012264 4922 generic.go:334] "Generic (PLEG): container finished" podID="7b826f3a-fb9a-4cf2-a4de-c6a394001583" containerID="a87f4af54fb6c7a7e63fe8e4acaf8b08557715517bd1538e2e62c06cd4396cfd" exitCode=0 Feb 18 11:47:21 crc kubenswrapper[4922]: I0218 11:47:21.012412 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerDied","Data":"a87f4af54fb6c7a7e63fe8e4acaf8b08557715517bd1538e2e62c06cd4396cfd"} Feb 18 11:47:21 crc kubenswrapper[4922]: I0218 11:47:21.012488 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"6e8d88074466e16ebb906dc20fa1fa6238859abee7bbaaee0417428d015ebce7"} Feb 18 11:47:22 crc kubenswrapper[4922]: I0218 11:47:22.033427 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"03636054b3e91332aab32a780f232662c130a45f2305252b2a7452d92b262e58"} Feb 18 11:47:22 crc kubenswrapper[4922]: I0218 11:47:22.033816 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"b2296cbb18d92c80b6c70ee9bdd4f34f0480a8c243f77be6f8dba16b4259d9ad"} Feb 18 11:47:22 crc kubenswrapper[4922]: I0218 11:47:22.033836 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"23785c92de84daadcefbcd93677f741a2911c038275116c42a870b94d15b1bb8"} Feb 18 11:47:22 crc kubenswrapper[4922]: I0218 11:47:22.033854 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"e4604063944727e2aedb2e56cde174699f16b0cd110353c8b5e9a2ffb95dd3f5"} Feb 18 11:47:22 crc kubenswrapper[4922]: I0218 11:47:22.033871 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"ecf90ff0857fef7355f76a5335271a20ba8266534519b68a2c92c1f6c2e1f8ef"} Feb 18 11:47:22 crc kubenswrapper[4922]: I0218 11:47:22.033886 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"c9ca32e51a89560fb8e225d5164d676909819dfa533aae4ff676ce9c45ce279a"} Feb 18 11:47:24 crc kubenswrapper[4922]: I0218 11:47:24.055753 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"815f6e83c156cb0a0b59b42c0c854928c6b48c9fc8e462799ce3b2f5a8442bda"} Feb 18 11:47:27 crc kubenswrapper[4922]: I0218 11:47:27.081117 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" event={"ID":"7b826f3a-fb9a-4cf2-a4de-c6a394001583","Type":"ContainerStarted","Data":"b6353450a31c16913533ddf271788c3533b2b1513128991a392cbd1b434a8e8f"} Feb 18 11:47:27 crc kubenswrapper[4922]: I0218 11:47:27.081622 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:27 crc kubenswrapper[4922]: I0218 11:47:27.081648 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:27 crc kubenswrapper[4922]: I0218 11:47:27.119900 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:27 crc kubenswrapper[4922]: I0218 11:47:27.126617 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" podStartSLOduration=8.126594747 podStartE2EDuration="8.126594747s" podCreationTimestamp="2026-02-18 11:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:47:27.125156072 +0000 UTC m=+648.852860172" watchObservedRunningTime="2026-02-18 11:47:27.126594747 +0000 UTC m=+648.854298847" Feb 18 11:47:28 crc kubenswrapper[4922]: I0218 11:47:28.086643 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:28 crc kubenswrapper[4922]: I0218 11:47:28.111220 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:47:33 crc kubenswrapper[4922]: I0218 11:47:33.974103 4922 scope.go:117] "RemoveContainer" containerID="7270cd2a97c0ebd334ca9515a9ea50d1714e52da4915c1ade982166b94d940c9" Feb 18 11:47:33 crc kubenswrapper[4922]: E0218 11:47:33.974833 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-c9xzd_openshift-multus(9b4595ac-c521-4ada-950d-e1b01cdff99b)\"" pod="openshift-multus/multus-c9xzd" podUID="9b4595ac-c521-4ada-950d-e1b01cdff99b" Feb 18 11:47:39 crc kubenswrapper[4922]: I0218 11:47:39.304725 4922 scope.go:117] "RemoveContainer" containerID="71d0689c83ee6c0c1704fac1f69846ec7d6ef1b16479ee4eae33acefd6b84765" Feb 18 11:47:40 crc kubenswrapper[4922]: I0218 11:47:40.165837 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/2.log" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.609094 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp"] Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.610865 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.613581 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.620958 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp"] Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.730932 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.730988 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xkrb\" (UniqueName: \"kubernetes.io/projected/00938d04-ee62-4756-830e-f66e2fbaab9d-kube-api-access-8xkrb\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.731300 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.832656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.832736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xkrb\" (UniqueName: \"kubernetes.io/projected/00938d04-ee62-4756-830e-f66e2fbaab9d-kube-api-access-8xkrb\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.832808 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.833512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.833521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.854126 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xkrb\" (UniqueName: \"kubernetes.io/projected/00938d04-ee62-4756-830e-f66e2fbaab9d-kube-api-access-8xkrb\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.958101 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:46 crc kubenswrapper[4922]: I0218 11:47:46.973043 4922 scope.go:117] "RemoveContainer" containerID="7270cd2a97c0ebd334ca9515a9ea50d1714e52da4915c1ade982166b94d940c9" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.008623 4922 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(9372639c5c83c78bfb3331184249525339a100732e043822202f74150e6c666c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.008736 4922 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(9372639c5c83c78bfb3331184249525339a100732e043822202f74150e6c666c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.008776 4922 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(9372639c5c83c78bfb3331184249525339a100732e043822202f74150e6c666c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.008863 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace(00938d04-ee62-4756-830e-f66e2fbaab9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace(00938d04-ee62-4756-830e-f66e2fbaab9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(9372639c5c83c78bfb3331184249525339a100732e043822202f74150e6c666c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" Feb 18 11:47:47 crc kubenswrapper[4922]: I0218 11:47:47.210958 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9xzd_9b4595ac-c521-4ada-950d-e1b01cdff99b/kube-multus/2.log" Feb 18 11:47:47 crc kubenswrapper[4922]: I0218 11:47:47.211103 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:47 crc kubenswrapper[4922]: I0218 11:47:47.211138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9xzd" event={"ID":"9b4595ac-c521-4ada-950d-e1b01cdff99b","Type":"ContainerStarted","Data":"ca6c59f69a9b82d6a8d68ebcc74a0ebc9731c0ab9458f534d5fbbdb39210c498"} Feb 18 11:47:47 crc kubenswrapper[4922]: I0218 11:47:47.211726 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.237018 4922 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(4b0c1b8effdec1f0795d00cd2d87715574e9304f5c1b194966456665a100ed32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.237108 4922 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(4b0c1b8effdec1f0795d00cd2d87715574e9304f5c1b194966456665a100ed32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.237358 4922 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(4b0c1b8effdec1f0795d00cd2d87715574e9304f5c1b194966456665a100ed32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:47:47 crc kubenswrapper[4922]: E0218 11:47:47.237473 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace(00938d04-ee62-4756-830e-f66e2fbaab9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace(00938d04-ee62-4756-830e-f66e2fbaab9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_openshift-marketplace_00938d04-ee62-4756-830e-f66e2fbaab9d_0(4b0c1b8effdec1f0795d00cd2d87715574e9304f5c1b194966456665a100ed32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" Feb 18 11:47:50 crc kubenswrapper[4922]: I0218 11:47:50.118764 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t47sv" Feb 18 11:48:00 crc kubenswrapper[4922]: I0218 11:48:00.973129 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:48:00 crc kubenswrapper[4922]: I0218 11:48:00.974160 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:48:01 crc kubenswrapper[4922]: I0218 11:48:01.237270 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp"] Feb 18 11:48:01 crc kubenswrapper[4922]: W0218 11:48:01.243547 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00938d04_ee62_4756_830e_f66e2fbaab9d.slice/crio-7c71d4c0c8bc51a41b0b55cdd5a469e8c57de802ea041e37234606852613315d WatchSource:0}: Error finding container 7c71d4c0c8bc51a41b0b55cdd5a469e8c57de802ea041e37234606852613315d: Status 404 returned error can't find the container with id 7c71d4c0c8bc51a41b0b55cdd5a469e8c57de802ea041e37234606852613315d Feb 18 11:48:01 crc kubenswrapper[4922]: I0218 11:48:01.300988 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" event={"ID":"00938d04-ee62-4756-830e-f66e2fbaab9d","Type":"ContainerStarted","Data":"7c71d4c0c8bc51a41b0b55cdd5a469e8c57de802ea041e37234606852613315d"} Feb 18 11:48:02 crc kubenswrapper[4922]: I0218 11:48:02.311077 4922 generic.go:334] "Generic (PLEG): container finished" podID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerID="27c4e9ecb5dc34c78af5e6369eb74c5f7a3dbdb2f850e0b5fc0d2b7be739625d" exitCode=0 Feb 18 11:48:02 crc kubenswrapper[4922]: I0218 11:48:02.311196 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" event={"ID":"00938d04-ee62-4756-830e-f66e2fbaab9d","Type":"ContainerDied","Data":"27c4e9ecb5dc34c78af5e6369eb74c5f7a3dbdb2f850e0b5fc0d2b7be739625d"} Feb 18 11:48:05 crc kubenswrapper[4922]: I0218 11:48:05.333549 4922 generic.go:334] "Generic (PLEG): container finished" 
podID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerID="274001f2e7a62c9010ffb094988d109a7d7641ec448b22bffc09d6c59f27fd5e" exitCode=0 Feb 18 11:48:05 crc kubenswrapper[4922]: I0218 11:48:05.333606 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" event={"ID":"00938d04-ee62-4756-830e-f66e2fbaab9d","Type":"ContainerDied","Data":"274001f2e7a62c9010ffb094988d109a7d7641ec448b22bffc09d6c59f27fd5e"} Feb 18 11:48:06 crc kubenswrapper[4922]: I0218 11:48:06.343648 4922 generic.go:334] "Generic (PLEG): container finished" podID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerID="ff16461aa96575c79fb9785e391e62d9e7f5a2382b77e8a210b2b2e197441584" exitCode=0 Feb 18 11:48:06 crc kubenswrapper[4922]: I0218 11:48:06.343769 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" event={"ID":"00938d04-ee62-4756-830e-f66e2fbaab9d","Type":"ContainerDied","Data":"ff16461aa96575c79fb9785e391e62d9e7f5a2382b77e8a210b2b2e197441584"} Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.647738 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.747221 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-util\") pod \"00938d04-ee62-4756-830e-f66e2fbaab9d\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.747282 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-bundle\") pod \"00938d04-ee62-4756-830e-f66e2fbaab9d\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.749273 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-bundle" (OuterVolumeSpecName: "bundle") pod "00938d04-ee62-4756-830e-f66e2fbaab9d" (UID: "00938d04-ee62-4756-830e-f66e2fbaab9d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.756826 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-util" (OuterVolumeSpecName: "util") pod "00938d04-ee62-4756-830e-f66e2fbaab9d" (UID: "00938d04-ee62-4756-830e-f66e2fbaab9d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.848122 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xkrb\" (UniqueName: \"kubernetes.io/projected/00938d04-ee62-4756-830e-f66e2fbaab9d-kube-api-access-8xkrb\") pod \"00938d04-ee62-4756-830e-f66e2fbaab9d\" (UID: \"00938d04-ee62-4756-830e-f66e2fbaab9d\") " Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.848756 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-util\") on node \"crc\" DevicePath \"\"" Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.849029 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00938d04-ee62-4756-830e-f66e2fbaab9d-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.854451 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00938d04-ee62-4756-830e-f66e2fbaab9d-kube-api-access-8xkrb" (OuterVolumeSpecName: "kube-api-access-8xkrb") pod "00938d04-ee62-4756-830e-f66e2fbaab9d" (UID: "00938d04-ee62-4756-830e-f66e2fbaab9d"). InnerVolumeSpecName "kube-api-access-8xkrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:48:07 crc kubenswrapper[4922]: I0218 11:48:07.950718 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xkrb\" (UniqueName: \"kubernetes.io/projected/00938d04-ee62-4756-830e-f66e2fbaab9d-kube-api-access-8xkrb\") on node \"crc\" DevicePath \"\"" Feb 18 11:48:08 crc kubenswrapper[4922]: I0218 11:48:08.360285 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" event={"ID":"00938d04-ee62-4756-830e-f66e2fbaab9d","Type":"ContainerDied","Data":"7c71d4c0c8bc51a41b0b55cdd5a469e8c57de802ea041e37234606852613315d"} Feb 18 11:48:08 crc kubenswrapper[4922]: I0218 11:48:08.360710 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c71d4c0c8bc51a41b0b55cdd5a469e8c57de802ea041e37234606852613315d" Feb 18 11:48:08 crc kubenswrapper[4922]: I0218 11:48:08.360323 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.748005 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p"] Feb 18 11:48:19 crc kubenswrapper[4922]: E0218 11:48:19.748614 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="extract" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.748626 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="extract" Feb 18 11:48:19 crc kubenswrapper[4922]: E0218 11:48:19.748646 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="util" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.748652 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="util" Feb 18 11:48:19 crc kubenswrapper[4922]: E0218 11:48:19.748661 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="pull" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.748667 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="pull" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.748753 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="00938d04-ee62-4756-830e-f66e2fbaab9d" containerName="extract" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.749073 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.750909 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.752123 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-57c68" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.754275 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.788475 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p"] Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.895221 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p"] Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.895842 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.898068 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-lpql4" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.898294 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.898291 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/333644cd-a424-47a3-b701-378149dcdc80-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p\" (UID: \"333644cd-a424-47a3-b701-378149dcdc80\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.898436 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/333644cd-a424-47a3-b701-378149dcdc80-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p\" (UID: \"333644cd-a424-47a3-b701-378149dcdc80\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.898478 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmph9\" (UniqueName: \"kubernetes.io/projected/1446ef26-f977-4255-a1b2-a42e8107303e-kube-api-access-pmph9\") pod \"obo-prometheus-operator-68bc856cb9-cq76p\" (UID: \"1446ef26-f977-4255-a1b2-a42e8107303e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.910065 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p"] Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.911725 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5"] Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.912344 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:19 crc kubenswrapper[4922]: I0218 11:48:19.937159 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.000516 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879d4ddb-47d1-4987-a980-e9f05104e5cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5\" (UID: \"879d4ddb-47d1-4987-a980-e9f05104e5cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.000582 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/333644cd-a424-47a3-b701-378149dcdc80-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p\" (UID: \"333644cd-a424-47a3-b701-378149dcdc80\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.000629 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/333644cd-a424-47a3-b701-378149dcdc80-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p\" (UID: \"333644cd-a424-47a3-b701-378149dcdc80\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.000657 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmph9\" (UniqueName: \"kubernetes.io/projected/1446ef26-f977-4255-a1b2-a42e8107303e-kube-api-access-pmph9\") pod \"obo-prometheus-operator-68bc856cb9-cq76p\" (UID: \"1446ef26-f977-4255-a1b2-a42e8107303e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.000680 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879d4ddb-47d1-4987-a980-e9f05104e5cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5\" (UID: \"879d4ddb-47d1-4987-a980-e9f05104e5cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.005973 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/333644cd-a424-47a3-b701-378149dcdc80-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p\" (UID: \"333644cd-a424-47a3-b701-378149dcdc80\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.010186 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/333644cd-a424-47a3-b701-378149dcdc80-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p\" (UID: \"333644cd-a424-47a3-b701-378149dcdc80\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" 
Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.023249 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmph9\" (UniqueName: \"kubernetes.io/projected/1446ef26-f977-4255-a1b2-a42e8107303e-kube-api-access-pmph9\") pod \"obo-prometheus-operator-68bc856cb9-cq76p\" (UID: \"1446ef26-f977-4255-a1b2-a42e8107303e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.064484 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.083136 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-tkz2d"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.092243 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.097641 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-5m464" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.099346 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.102343 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/10c40ab6-7b55-410d-958e-3a6a37818c88-observability-operator-tls\") pod \"observability-operator-59bdc8b94-tkz2d\" (UID: \"10c40ab6-7b55-410d-958e-3a6a37818c88\") " pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.102603 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879d4ddb-47d1-4987-a980-e9f05104e5cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5\" (UID: \"879d4ddb-47d1-4987-a980-e9f05104e5cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.102707 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vv9\" (UniqueName: \"kubernetes.io/projected/10c40ab6-7b55-410d-958e-3a6a37818c88-kube-api-access-x4vv9\") pod \"observability-operator-59bdc8b94-tkz2d\" (UID: \"10c40ab6-7b55-410d-958e-3a6a37818c88\") " pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.102845 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879d4ddb-47d1-4987-a980-e9f05104e5cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5\" (UID: \"879d4ddb-47d1-4987-a980-e9f05104e5cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.108258 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-tkz2d"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.109901 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879d4ddb-47d1-4987-a980-e9f05104e5cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5\" (UID: \"879d4ddb-47d1-4987-a980-e9f05104e5cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.120326 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/879d4ddb-47d1-4987-a980-e9f05104e5cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5\" (UID: \"879d4ddb-47d1-4987-a980-e9f05104e5cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.174099 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mh85w"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.174977 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.180840 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-q4j4d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.187111 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mh85w"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.203275 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20e893d8-cc0c-4bdf-83d6-698e08e5d82b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mh85w\" (UID: \"20e893d8-cc0c-4bdf-83d6-698e08e5d82b\") " pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.203328 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/10c40ab6-7b55-410d-958e-3a6a37818c88-observability-operator-tls\") pod \"observability-operator-59bdc8b94-tkz2d\" (UID: \"10c40ab6-7b55-410d-958e-3a6a37818c88\") " pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.203376 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x4vv9\" (UniqueName: \"kubernetes.io/projected/10c40ab6-7b55-410d-958e-3a6a37818c88-kube-api-access-x4vv9\") pod \"observability-operator-59bdc8b94-tkz2d\" (UID: \"10c40ab6-7b55-410d-958e-3a6a37818c88\") " pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.203418 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49t42\" (UniqueName: \"kubernetes.io/projected/20e893d8-cc0c-4bdf-83d6-698e08e5d82b-kube-api-access-49t42\") pod \"perses-operator-5bf474d74f-mh85w\" (UID: \"20e893d8-cc0c-4bdf-83d6-698e08e5d82b\") " pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.211501 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.223104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/10c40ab6-7b55-410d-958e-3a6a37818c88-observability-operator-tls\") pod \"observability-operator-59bdc8b94-tkz2d\" (UID: \"10c40ab6-7b55-410d-958e-3a6a37818c88\") " pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.232580 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.248454 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4vv9\" (UniqueName: \"kubernetes.io/projected/10c40ab6-7b55-410d-958e-3a6a37818c88-kube-api-access-x4vv9\") pod \"observability-operator-59bdc8b94-tkz2d\" (UID: \"10c40ab6-7b55-410d-958e-3a6a37818c88\") " pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.304650 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20e893d8-cc0c-4bdf-83d6-698e08e5d82b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mh85w\" (UID: \"20e893d8-cc0c-4bdf-83d6-698e08e5d82b\") " pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.304777 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49t42\" (UniqueName: \"kubernetes.io/projected/20e893d8-cc0c-4bdf-83d6-698e08e5d82b-kube-api-access-49t42\") pod \"perses-operator-5bf474d74f-mh85w\" (UID: \"20e893d8-cc0c-4bdf-83d6-698e08e5d82b\") " pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.306434 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20e893d8-cc0c-4bdf-83d6-698e08e5d82b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mh85w\" (UID: \"20e893d8-cc0c-4bdf-83d6-698e08e5d82b\") " pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.337799 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49t42\" (UniqueName: 
\"kubernetes.io/projected/20e893d8-cc0c-4bdf-83d6-698e08e5d82b-kube-api-access-49t42\") pod \"perses-operator-5bf474d74f-mh85w\" (UID: \"20e893d8-cc0c-4bdf-83d6-698e08e5d82b\") " pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.337907 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.423093 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" event={"ID":"1446ef26-f977-4255-a1b2-a42e8107303e","Type":"ContainerStarted","Data":"51ec3eea3e9dbf3f81fde0c2533d89180c44631d8326fa4d3f7424ccd6c8208b"} Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.439703 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.500466 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p"] Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.509810 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.655244 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5"] Feb 18 11:48:20 crc kubenswrapper[4922]: W0218 11:48:20.660559 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879d4ddb_47d1_4987_a980_e9f05104e5cb.slice/crio-d130111a8c10d96bda6aee9ebd583cdfe68bf855db61549867de268407fb946d WatchSource:0}: Error finding container d130111a8c10d96bda6aee9ebd583cdfe68bf855db61549867de268407fb946d: Status 404 returned error can't find the container with id d130111a8c10d96bda6aee9ebd583cdfe68bf855db61549867de268407fb946d Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.746686 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mh85w"] Feb 18 11:48:20 crc kubenswrapper[4922]: W0218 11:48:20.749568 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e893d8_cc0c_4bdf_83d6_698e08e5d82b.slice/crio-d6ecd55219bc848d998996159031bc8f52c458e20914bd6cc68c7fd8ea617729 WatchSource:0}: Error finding container d6ecd55219bc848d998996159031bc8f52c458e20914bd6cc68c7fd8ea617729: Status 404 returned error can't find the container with id d6ecd55219bc848d998996159031bc8f52c458e20914bd6cc68c7fd8ea617729 Feb 18 11:48:20 crc kubenswrapper[4922]: I0218 11:48:20.839350 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-tkz2d"] Feb 18 11:48:21 crc kubenswrapper[4922]: I0218 11:48:21.434279 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" 
event={"ID":"333644cd-a424-47a3-b701-378149dcdc80","Type":"ContainerStarted","Data":"bfb3f5af5fdd25aea6b8e43188f03e9bd226ff5f52c46d835df6d64951891b75"} Feb 18 11:48:21 crc kubenswrapper[4922]: I0218 11:48:21.435633 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-mh85w" event={"ID":"20e893d8-cc0c-4bdf-83d6-698e08e5d82b","Type":"ContainerStarted","Data":"d6ecd55219bc848d998996159031bc8f52c458e20914bd6cc68c7fd8ea617729"} Feb 18 11:48:21 crc kubenswrapper[4922]: I0218 11:48:21.437178 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" event={"ID":"879d4ddb-47d1-4987-a980-e9f05104e5cb","Type":"ContainerStarted","Data":"d130111a8c10d96bda6aee9ebd583cdfe68bf855db61549867de268407fb946d"} Feb 18 11:48:21 crc kubenswrapper[4922]: I0218 11:48:21.442280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" event={"ID":"10c40ab6-7b55-410d-958e-3a6a37818c88","Type":"ContainerStarted","Data":"5e42eb36343f27623db83714142fb72f73f55eda27d47a6db8b06cc55d0dc155"} Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.522168 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" event={"ID":"333644cd-a424-47a3-b701-378149dcdc80","Type":"ContainerStarted","Data":"2d68b0fa4fed319e58c3f11397dc751ebec3ef25550c7e39c68d1baf4f73f03c"} Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.524315 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-mh85w" event={"ID":"20e893d8-cc0c-4bdf-83d6-698e08e5d82b","Type":"ContainerStarted","Data":"6fbb52c7299b7f81e9e1da049175b542d94caa65f52eb81e4012cbc3f3e8cbba"} Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.524502 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.526467 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" event={"ID":"879d4ddb-47d1-4987-a980-e9f05104e5cb","Type":"ContainerStarted","Data":"ca02b621f7015a194f2ab4dd72d7e49a730566bb65fd0f968d756e46dc231160"} Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.528078 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" event={"ID":"10c40ab6-7b55-410d-958e-3a6a37818c88","Type":"ContainerStarted","Data":"1f8cc572431c4a7cc6dab26aab6aa0ca6ae6f2757a46873bea2e63ad26b51048"} Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.528318 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.530195 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" event={"ID":"1446ef26-f977-4255-a1b2-a42e8107303e","Type":"ContainerStarted","Data":"7d3220e5856f5b48eb51b0bfa964b73fdd7d74ebe7a99d5d79c290c9ebf9891d"} Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.530556 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.543883 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p" podStartSLOduration=2.863131878 podStartE2EDuration="11.543859028s" podCreationTimestamp="2026-02-18 11:48:19 +0000 UTC" firstStartedPulling="2026-02-18 11:48:20.529917167 +0000 UTC m=+702.257621247" lastFinishedPulling="2026-02-18 11:48:29.210644317 +0000 UTC m=+710.938348397" 
observedRunningTime="2026-02-18 11:48:30.538187867 +0000 UTC m=+712.265891947" watchObservedRunningTime="2026-02-18 11:48:30.543859028 +0000 UTC m=+712.271563108" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.562806 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5" podStartSLOduration=2.978936367 podStartE2EDuration="11.56278742s" podCreationTimestamp="2026-02-18 11:48:19 +0000 UTC" firstStartedPulling="2026-02-18 11:48:20.66311793 +0000 UTC m=+702.390822010" lastFinishedPulling="2026-02-18 11:48:29.246968983 +0000 UTC m=+710.974673063" observedRunningTime="2026-02-18 11:48:30.559915249 +0000 UTC m=+712.287619329" watchObservedRunningTime="2026-02-18 11:48:30.56278742 +0000 UTC m=+712.290491500" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.596160 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cq76p" podStartSLOduration=2.7057864929999997 podStartE2EDuration="11.596138562s" podCreationTimestamp="2026-02-18 11:48:19 +0000 UTC" firstStartedPulling="2026-02-18 11:48:20.356960772 +0000 UTC m=+702.084664852" lastFinishedPulling="2026-02-18 11:48:29.247312841 +0000 UTC m=+710.975016921" observedRunningTime="2026-02-18 11:48:30.59405189 +0000 UTC m=+712.321755970" watchObservedRunningTime="2026-02-18 11:48:30.596138562 +0000 UTC m=+712.323842642" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.612953 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-tkz2d" podStartSLOduration=2.1566347710000002 podStartE2EDuration="10.612930871s" podCreationTimestamp="2026-02-18 11:48:20 +0000 UTC" firstStartedPulling="2026-02-18 11:48:20.844938757 +0000 UTC m=+702.572642837" lastFinishedPulling="2026-02-18 11:48:29.301234857 +0000 UTC m=+711.028938937" observedRunningTime="2026-02-18 
11:48:30.609996108 +0000 UTC m=+712.337700198" watchObservedRunningTime="2026-02-18 11:48:30.612930871 +0000 UTC m=+712.340634971" Feb 18 11:48:30 crc kubenswrapper[4922]: I0218 11:48:30.634828 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-mh85w" podStartSLOduration=2.140907029 podStartE2EDuration="10.634812437s" podCreationTimestamp="2026-02-18 11:48:20 +0000 UTC" firstStartedPulling="2026-02-18 11:48:20.751868165 +0000 UTC m=+702.479572245" lastFinishedPulling="2026-02-18 11:48:29.245773573 +0000 UTC m=+710.973477653" observedRunningTime="2026-02-18 11:48:30.634599722 +0000 UTC m=+712.362303802" watchObservedRunningTime="2026-02-18 11:48:30.634812437 +0000 UTC m=+712.362516517" Feb 18 11:48:40 crc kubenswrapper[4922]: I0218 11:48:40.512771 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-mh85w" Feb 18 11:48:56 crc kubenswrapper[4922]: I0218 11:48:56.913680 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846"] Feb 18 11:48:56 crc kubenswrapper[4922]: I0218 11:48:56.915403 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:56 crc kubenswrapper[4922]: I0218 11:48:56.917524 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 11:48:56 crc kubenswrapper[4922]: I0218 11:48:56.922726 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846"] Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.089372 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.089743 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.089833 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxm87\" (UniqueName: \"kubernetes.io/projected/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-kube-api-access-rxm87\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: 
I0218 11:48:57.191629 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.191710 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxm87\" (UniqueName: \"kubernetes.io/projected/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-kube-api-access-rxm87\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.191762 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.192184 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.192275 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.218935 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxm87\" (UniqueName: \"kubernetes.io/projected/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-kube-api-access-rxm87\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.231153 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:48:57 crc kubenswrapper[4922]: I0218 11:48:57.679342 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846"] Feb 18 11:48:58 crc kubenswrapper[4922]: I0218 11:48:58.690297 4922 generic.go:334] "Generic (PLEG): container finished" podID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerID="f317ae4fd0a8f319d576033f5e5722ebb6e82ba20cf05423570e941ca3017274" exitCode=0 Feb 18 11:48:58 crc kubenswrapper[4922]: I0218 11:48:58.690342 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" event={"ID":"b19cf8eb-c4e0-42a2-bc33-246e5c756bda","Type":"ContainerDied","Data":"f317ae4fd0a8f319d576033f5e5722ebb6e82ba20cf05423570e941ca3017274"} Feb 18 11:48:58 crc kubenswrapper[4922]: I0218 11:48:58.690405 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" event={"ID":"b19cf8eb-c4e0-42a2-bc33-246e5c756bda","Type":"ContainerStarted","Data":"0b2941e027dac645858be77cdc474ee5835f582540fe8a51fabd3794a74a1b37"} Feb 18 11:49:00 crc kubenswrapper[4922]: I0218 11:49:00.701943 4922 generic.go:334] "Generic (PLEG): container finished" podID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerID="fee21b33036842e34387a977a9de1d244aa81b9728e3adff4a2c5ef581c836c7" exitCode=0 Feb 18 11:49:00 crc kubenswrapper[4922]: I0218 11:49:00.702020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" event={"ID":"b19cf8eb-c4e0-42a2-bc33-246e5c756bda","Type":"ContainerDied","Data":"fee21b33036842e34387a977a9de1d244aa81b9728e3adff4a2c5ef581c836c7"} Feb 18 11:49:01 crc kubenswrapper[4922]: I0218 11:49:01.714596 4922 generic.go:334] "Generic (PLEG): container finished" podID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerID="a06c5179c5c3ffcaaf7732a92ab878d68e7d526acdd98925fa5cbaf37828b776" exitCode=0 Feb 18 11:49:01 crc kubenswrapper[4922]: I0218 11:49:01.714634 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" event={"ID":"b19cf8eb-c4e0-42a2-bc33-246e5c756bda","Type":"ContainerDied","Data":"a06c5179c5c3ffcaaf7732a92ab878d68e7d526acdd98925fa5cbaf37828b776"} Feb 18 11:49:02 crc kubenswrapper[4922]: I0218 11:49:02.932957 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.074521 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-util\") pod \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.074962 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-bundle\") pod \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.075052 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxm87\" (UniqueName: \"kubernetes.io/projected/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-kube-api-access-rxm87\") pod \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\" (UID: \"b19cf8eb-c4e0-42a2-bc33-246e5c756bda\") " Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.075979 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-bundle" (OuterVolumeSpecName: "bundle") pod "b19cf8eb-c4e0-42a2-bc33-246e5c756bda" (UID: "b19cf8eb-c4e0-42a2-bc33-246e5c756bda"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.076623 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.084705 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-kube-api-access-rxm87" (OuterVolumeSpecName: "kube-api-access-rxm87") pod "b19cf8eb-c4e0-42a2-bc33-246e5c756bda" (UID: "b19cf8eb-c4e0-42a2-bc33-246e5c756bda"). InnerVolumeSpecName "kube-api-access-rxm87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.099602 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-util" (OuterVolumeSpecName: "util") pod "b19cf8eb-c4e0-42a2-bc33-246e5c756bda" (UID: "b19cf8eb-c4e0-42a2-bc33-246e5c756bda"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.177748 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-util\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.177786 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxm87\" (UniqueName: \"kubernetes.io/projected/b19cf8eb-c4e0-42a2-bc33-246e5c756bda-kube-api-access-rxm87\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.729631 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" event={"ID":"b19cf8eb-c4e0-42a2-bc33-246e5c756bda","Type":"ContainerDied","Data":"0b2941e027dac645858be77cdc474ee5835f582540fe8a51fabd3794a74a1b37"} Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.729689 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b2941e027dac645858be77cdc474ee5835f582540fe8a51fabd3794a74a1b37" Feb 18 11:49:03 crc kubenswrapper[4922]: I0218 11:49:03.729723 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.406994 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-p7vsx"] Feb 18 11:49:05 crc kubenswrapper[4922]: E0218 11:49:05.407278 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="extract" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.407293 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="extract" Feb 18 11:49:05 crc kubenswrapper[4922]: E0218 11:49:05.407306 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="pull" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.407313 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="pull" Feb 18 11:49:05 crc kubenswrapper[4922]: E0218 11:49:05.407324 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="util" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.407332 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="util" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.407476 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19cf8eb-c4e0-42a2-bc33-246e5c756bda" containerName="extract" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.408017 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.411492 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.411757 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tv7x2" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.413593 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.415792 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-p7vsx"] Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.606136 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4dv\" (UniqueName: \"kubernetes.io/projected/578f51b2-8e78-4720-93f6-7cd9ce17e2ed-kube-api-access-ct4dv\") pod \"nmstate-operator-694c9596b7-p7vsx\" (UID: \"578f51b2-8e78-4720-93f6-7cd9ce17e2ed\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.707432 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4dv\" (UniqueName: \"kubernetes.io/projected/578f51b2-8e78-4720-93f6-7cd9ce17e2ed-kube-api-access-ct4dv\") pod \"nmstate-operator-694c9596b7-p7vsx\" (UID: \"578f51b2-8e78-4720-93f6-7cd9ce17e2ed\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.725078 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4dv\" (UniqueName: \"kubernetes.io/projected/578f51b2-8e78-4720-93f6-7cd9ce17e2ed-kube-api-access-ct4dv\") pod \"nmstate-operator-694c9596b7-p7vsx\" (UID: 
\"578f51b2-8e78-4720-93f6-7cd9ce17e2ed\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.731663 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" Feb 18 11:49:05 crc kubenswrapper[4922]: I0218 11:49:05.955067 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-p7vsx"] Feb 18 11:49:06 crc kubenswrapper[4922]: I0218 11:49:06.748612 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" event={"ID":"578f51b2-8e78-4720-93f6-7cd9ce17e2ed","Type":"ContainerStarted","Data":"23fb181ec1e27b020e0dfd9aecccf9eb4b208f7f089ba0e0341f2e3be0aae32d"} Feb 18 11:49:08 crc kubenswrapper[4922]: I0218 11:49:08.760213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" event={"ID":"578f51b2-8e78-4720-93f6-7cd9ce17e2ed","Type":"ContainerStarted","Data":"03ac8600b2ed13b54fdddd204431cde0ea9aa0540af33fc438a688c0a14c3d5e"} Feb 18 11:49:08 crc kubenswrapper[4922]: I0218 11:49:08.777923 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-p7vsx" podStartSLOduration=1.555292347 podStartE2EDuration="3.777905069s" podCreationTimestamp="2026-02-18 11:49:05 +0000 UTC" firstStartedPulling="2026-02-18 11:49:05.966683509 +0000 UTC m=+747.694387579" lastFinishedPulling="2026-02-18 11:49:08.189296221 +0000 UTC m=+749.917000301" observedRunningTime="2026-02-18 11:49:08.773919281 +0000 UTC m=+750.501623381" watchObservedRunningTime="2026-02-18 11:49:08.777905069 +0000 UTC m=+750.505609149" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.674350 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-2dtql"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 
11:49:09.675789 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.679257 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5nqz7" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.700355 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-2dtql"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.715678 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.716483 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.718183 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.738897 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xgmj2"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.739706 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xgmj2" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.745014 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.826919 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.827798 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.831036 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.831378 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9rxjj" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.840534 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"] Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.840561 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855255 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df5bbc9b-9ba2-416b-93db-c4f6155b6906-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-7mvdv\" (UID: \"df5bbc9b-9ba2-416b-93db-c4f6155b6906\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855305 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm4nn\" (UniqueName: \"kubernetes.io/projected/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-kube-api-access-nm4nn\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855353 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6tlg\" (UniqueName: \"kubernetes.io/projected/4e3e71a0-5178-4016-853d-0d0c31563d99-kube-api-access-n6tlg\") pod \"nmstate-metrics-58c85c668d-2dtql\" (UID: 
\"4e3e71a0-5178-4016-853d-0d0c31563d99\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855501 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-ovs-socket\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855587 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855616 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs46n\" (UniqueName: \"kubernetes.io/projected/df5bbc9b-9ba2-416b-93db-c4f6155b6906-kube-api-access-hs46n\") pod \"nmstate-webhook-866bcb46dc-7mvdv\" (UID: \"df5bbc9b-9ba2-416b-93db-c4f6155b6906\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855650 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855677 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" 
(UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-nmstate-lock\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kxn9\" (UniqueName: \"kubernetes.io/projected/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-kube-api-access-6kxn9\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.855896 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-dbus-socket\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956407 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956461 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs46n\" (UniqueName: \"kubernetes.io/projected/df5bbc9b-9ba2-416b-93db-c4f6155b6906-kube-api-access-hs46n\") pod \"nmstate-webhook-866bcb46dc-7mvdv\" (UID: \"df5bbc9b-9ba2-416b-93db-c4f6155b6906\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956499 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-nmstate-lock\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956557 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kxn9\" (UniqueName: \"kubernetes.io/projected/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-kube-api-access-6kxn9\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-dbus-socket\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df5bbc9b-9ba2-416b-93db-c4f6155b6906-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-7mvdv\" (UID: \"df5bbc9b-9ba2-416b-93db-c4f6155b6906\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956633 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm4nn\" (UniqueName: \"kubernetes.io/projected/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-kube-api-access-nm4nn\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956684 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-nmstate-lock\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6tlg\" (UniqueName: \"kubernetes.io/projected/4e3e71a0-5178-4016-853d-0d0c31563d99-kube-api-access-n6tlg\") pod \"nmstate-metrics-58c85c668d-2dtql\" (UID: \"4e3e71a0-5178-4016-853d-0d0c31563d99\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956853 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-ovs-socket\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.957027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-ovs-socket\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.956915 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-dbus-socket\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.958346 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.968565 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df5bbc9b-9ba2-416b-93db-c4f6155b6906-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-7mvdv\" (UID: \"df5bbc9b-9ba2-416b-93db-c4f6155b6906\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.970039 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.975052 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kxn9\" (UniqueName: \"kubernetes.io/projected/ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01-kube-api-access-6kxn9\") pod \"nmstate-handler-xgmj2\" (UID: \"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01\") " pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.978151 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm4nn\" (UniqueName: \"kubernetes.io/projected/8a41aeaf-5b15-4c8c-8abc-ad77b8e33896-kube-api-access-nm4nn\") pod \"nmstate-console-plugin-5c78fc5d65-8x52x\" (UID: \"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.981505 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6tlg\" (UniqueName: \"kubernetes.io/projected/4e3e71a0-5178-4016-853d-0d0c31563d99-kube-api-access-n6tlg\") pod \"nmstate-metrics-58c85c668d-2dtql\" (UID: \"4e3e71a0-5178-4016-853d-0d0c31563d99\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.991196 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql"
Feb 18 11:49:09 crc kubenswrapper[4922]: I0218 11:49:09.992866 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs46n\" (UniqueName: \"kubernetes.io/projected/df5bbc9b-9ba2-416b-93db-c4f6155b6906-kube-api-access-hs46n\") pod \"nmstate-webhook-866bcb46dc-7mvdv\" (UID: \"df5bbc9b-9ba2-416b-93db-c4f6155b6906\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.033307 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54f4fcfcbd-86swd"]
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.039841 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.041622 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.046940 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54f4fcfcbd-86swd"]
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-trusted-ca-bundle\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058780 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c837c9-e56e-4076-a7ed-1093dc99787c-console-oauth-config\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058832 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c837c9-e56e-4076-a7ed-1093dc99787c-console-serving-cert\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058854 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-console-config\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058874 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-service-ca\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058891 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-oauth-serving-cert\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.058911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h89d\" (UniqueName: \"kubernetes.io/projected/73c837c9-e56e-4076-a7ed-1093dc99787c-kube-api-access-5h89d\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.060263 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:10 crc kubenswrapper[4922]: W0218 11:49:10.118977 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea43ed5b_6735_4fd5_8fc5_1a01dcaeea01.slice/crio-b4c8ed4a8968f2756d55c4a919a88ce813ef651f9dc338bdd94d980e199a8760 WatchSource:0}: Error finding container b4c8ed4a8968f2756d55c4a919a88ce813ef651f9dc338bdd94d980e199a8760: Status 404 returned error can't find the container with id b4c8ed4a8968f2756d55c4a919a88ce813ef651f9dc338bdd94d980e199a8760
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.143090 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.159919 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-trusted-ca-bundle\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.159977 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c837c9-e56e-4076-a7ed-1093dc99787c-console-oauth-config\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.160010 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c837c9-e56e-4076-a7ed-1093dc99787c-console-serving-cert\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.160028 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-console-config\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.160047 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-service-ca\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.160064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-oauth-serving-cert\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.160086 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h89d\" (UniqueName: \"kubernetes.io/projected/73c837c9-e56e-4076-a7ed-1093dc99787c-kube-api-access-5h89d\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.161407 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-console-config\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.161519 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-service-ca\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.161646 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-oauth-serving-cert\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.162457 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c837c9-e56e-4076-a7ed-1093dc99787c-trusted-ca-bundle\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.170405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c837c9-e56e-4076-a7ed-1093dc99787c-console-serving-cert\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.170877 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c837c9-e56e-4076-a7ed-1093dc99787c-console-oauth-config\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.183301 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h89d\" (UniqueName: \"kubernetes.io/projected/73c837c9-e56e-4076-a7ed-1093dc99787c-kube-api-access-5h89d\") pod \"console-54f4fcfcbd-86swd\" (UID: \"73c837c9-e56e-4076-a7ed-1093dc99787c\") " pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.242773 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-2dtql"]
Feb 18 11:49:10 crc kubenswrapper[4922]: W0218 11:49:10.261190 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3e71a0_5178_4016_853d_0d0c31563d99.slice/crio-9a438a87856d64e8c6c0602eab77806f09019b8b4c949a60a50d07507c8afc5d WatchSource:0}: Error finding container 9a438a87856d64e8c6c0602eab77806f09019b8b4c949a60a50d07507c8afc5d: Status 404 returned error can't find the container with id 9a438a87856d64e8c6c0602eab77806f09019b8b4c949a60a50d07507c8afc5d
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.304741 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"]
Feb 18 11:49:10 crc kubenswrapper[4922]: W0218 11:49:10.312171 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf5bbc9b_9ba2_416b_93db_c4f6155b6906.slice/crio-6552cf11b2ffba1a085f7727712027bd5961c7c9a1779c6422085715b1b5b708 WatchSource:0}: Error finding container 6552cf11b2ffba1a085f7727712027bd5961c7c9a1779c6422085715b1b5b708: Status 404 returned error can't find the container with id 6552cf11b2ffba1a085f7727712027bd5961c7c9a1779c6422085715b1b5b708
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.387814 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x"]
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.392121 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.588765 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54f4fcfcbd-86swd"]
Feb 18 11:49:10 crc kubenswrapper[4922]: W0218 11:49:10.593028 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c837c9_e56e_4076_a7ed_1093dc99787c.slice/crio-2b3f8430eb5e0a2b5aeb43f5591d8a9adaf0170d80f47e516348a127b8d0199e WatchSource:0}: Error finding container 2b3f8430eb5e0a2b5aeb43f5591d8a9adaf0170d80f47e516348a127b8d0199e: Status 404 returned error can't find the container with id 2b3f8430eb5e0a2b5aeb43f5591d8a9adaf0170d80f47e516348a127b8d0199e
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.774983 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" event={"ID":"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896","Type":"ContainerStarted","Data":"c1c0aad688c46b210d97c6b4744ee5db3393cebc6a77d3cccdce985ebeb53f97"}
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.776333 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xgmj2" event={"ID":"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01","Type":"ContainerStarted","Data":"b4c8ed4a8968f2756d55c4a919a88ce813ef651f9dc338bdd94d980e199a8760"}
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.777847 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54f4fcfcbd-86swd" event={"ID":"73c837c9-e56e-4076-a7ed-1093dc99787c","Type":"ContainerStarted","Data":"46fea00450caf74b5dab5b448ff200117f02a85055a02939633fd4ddd3f537ed"}
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.777879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54f4fcfcbd-86swd" event={"ID":"73c837c9-e56e-4076-a7ed-1093dc99787c","Type":"ContainerStarted","Data":"2b3f8430eb5e0a2b5aeb43f5591d8a9adaf0170d80f47e516348a127b8d0199e"}
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.779219 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv" event={"ID":"df5bbc9b-9ba2-416b-93db-c4f6155b6906","Type":"ContainerStarted","Data":"6552cf11b2ffba1a085f7727712027bd5961c7c9a1779c6422085715b1b5b708"}
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.780240 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql" event={"ID":"4e3e71a0-5178-4016-853d-0d0c31563d99","Type":"ContainerStarted","Data":"9a438a87856d64e8c6c0602eab77806f09019b8b4c949a60a50d07507c8afc5d"}
Feb 18 11:49:10 crc kubenswrapper[4922]: I0218 11:49:10.796590 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54f4fcfcbd-86swd" podStartSLOduration=0.796569354 podStartE2EDuration="796.569354ms" podCreationTimestamp="2026-02-18 11:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:49:10.794667417 +0000 UTC m=+752.522371507" watchObservedRunningTime="2026-02-18 11:49:10.796569354 +0000 UTC m=+752.524273444"
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.354195 4922 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.799850 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv" event={"ID":"df5bbc9b-9ba2-416b-93db-c4f6155b6906","Type":"ContainerStarted","Data":"8babf6eb9f2388affc816411dfa34c11f1586951e8dfefd647f4b350b7c6b593"}
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.800471 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.801433 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql" event={"ID":"4e3e71a0-5178-4016-853d-0d0c31563d99","Type":"ContainerStarted","Data":"82c4c1563907ae81ba22c301e70a0195c6a4f5cd83d16f2023b7ea1f11569fc1"}
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.802725 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" event={"ID":"8a41aeaf-5b15-4c8c-8abc-ad77b8e33896","Type":"ContainerStarted","Data":"a84288c98e8ee4e1135b7c61012ecc0c85db6ff330a82a1c0f7cb1e2e0dc6979"}
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.805186 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xgmj2" event={"ID":"ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01","Type":"ContainerStarted","Data":"0e1a45db1b17d91496fe6ad090b385b45be5ad488f0b95f705b7ad4302b59e24"}
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.805375 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.820116 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv" podStartSLOduration=2.095602069 podStartE2EDuration="4.820092229s" podCreationTimestamp="2026-02-18 11:49:09 +0000 UTC" firstStartedPulling="2026-02-18 11:49:10.318942442 +0000 UTC m=+752.046646522" lastFinishedPulling="2026-02-18 11:49:13.043432572 +0000 UTC m=+754.771136682" observedRunningTime="2026-02-18 11:49:13.814866461 +0000 UTC m=+755.542570531" watchObservedRunningTime="2026-02-18 11:49:13.820092229 +0000 UTC m=+755.547796309"
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.834778 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xgmj2" podStartSLOduration=1.9015293610000001 podStartE2EDuration="4.834763409s" podCreationTimestamp="2026-02-18 11:49:09 +0000 UTC" firstStartedPulling="2026-02-18 11:49:10.121643095 +0000 UTC m=+751.849347175" lastFinishedPulling="2026-02-18 11:49:13.054877113 +0000 UTC m=+754.782581223" observedRunningTime="2026-02-18 11:49:13.833700453 +0000 UTC m=+755.561404533" watchObservedRunningTime="2026-02-18 11:49:13.834763409 +0000 UTC m=+755.562467489"
Feb 18 11:49:13 crc kubenswrapper[4922]: I0218 11:49:13.852234 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8x52x" podStartSLOduration=2.211911226 podStartE2EDuration="4.852207778s" podCreationTimestamp="2026-02-18 11:49:09 +0000 UTC" firstStartedPulling="2026-02-18 11:49:10.394016266 +0000 UTC m=+752.121720346" lastFinishedPulling="2026-02-18 11:49:13.034312818 +0000 UTC m=+754.762016898" observedRunningTime="2026-02-18 11:49:13.848930467 +0000 UTC m=+755.576634567" watchObservedRunningTime="2026-02-18 11:49:13.852207778 +0000 UTC m=+755.579911858"
Feb 18 11:49:15 crc kubenswrapper[4922]: I0218 11:49:15.818199 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql" event={"ID":"4e3e71a0-5178-4016-853d-0d0c31563d99","Type":"ContainerStarted","Data":"fb7e680683328a66fed5134d370553e5315a803ccf7e3bdef8fd47bb90ac1508"}
Feb 18 11:49:15 crc kubenswrapper[4922]: I0218 11:49:15.839909 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-2dtql" podStartSLOduration=1.905352466 podStartE2EDuration="6.83988528s" podCreationTimestamp="2026-02-18 11:49:09 +0000 UTC" firstStartedPulling="2026-02-18 11:49:10.262083585 +0000 UTC m=+751.989787665" lastFinishedPulling="2026-02-18 11:49:15.196616399 +0000 UTC m=+756.924320479" observedRunningTime="2026-02-18 11:49:15.83663062 +0000 UTC m=+757.564334790" watchObservedRunningTime="2026-02-18 11:49:15.83988528 +0000 UTC m=+757.567589390"
Feb 18 11:49:20 crc kubenswrapper[4922]: I0218 11:49:20.098282 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xgmj2"
Feb 18 11:49:20 crc kubenswrapper[4922]: I0218 11:49:20.393180 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:20 crc kubenswrapper[4922]: I0218 11:49:20.393259 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:20 crc kubenswrapper[4922]: I0218 11:49:20.397771 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:20 crc kubenswrapper[4922]: I0218 11:49:20.855064 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54f4fcfcbd-86swd"
Feb 18 11:49:20 crc kubenswrapper[4922]: I0218 11:49:20.904834 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nfn89"]
Feb 18 11:49:30 crc kubenswrapper[4922]: I0218 11:49:30.046930 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-7mvdv"
Feb 18 11:49:39 crc kubenswrapper[4922]: I0218 11:49:39.807448 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:49:39 crc kubenswrapper[4922]: I0218 11:49:39.808226 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.562598 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"]
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.564093 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.566306 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.579602 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"]
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.631573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8llc\" (UniqueName: \"kubernetes.io/projected/188679bc-8b67-4136-94ce-fa515c1c950a-kube-api-access-k8llc\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.631644 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.631669 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.732606 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8llc\" (UniqueName: \"kubernetes.io/projected/188679bc-8b67-4136-94ce-fa515c1c950a-kube-api-access-k8llc\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.732688 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.732727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.733356 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.733743 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.753411 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8llc\" (UniqueName: \"kubernetes.io/projected/188679bc-8b67-4136-94ce-fa515c1c950a-kube-api-access-k8llc\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:44 crc kubenswrapper[4922]: I0218 11:49:44.881977 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"
Feb 18 11:49:45 crc kubenswrapper[4922]: I0218 11:49:45.330609 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4"]
Feb 18 11:49:45 crc kubenswrapper[4922]: I0218 11:49:45.946630 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nfn89" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerName="console" containerID="cri-o://af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194" gracePeriod=15
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.042681 4922 generic.go:334] "Generic (PLEG): container finished" podID="188679bc-8b67-4136-94ce-fa515c1c950a" containerID="a0d393bb91b2595d47ee36d33109599b029b2e17f22ea41b4c78c4daf107351d" exitCode=0
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.042741 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" event={"ID":"188679bc-8b67-4136-94ce-fa515c1c950a","Type":"ContainerDied","Data":"a0d393bb91b2595d47ee36d33109599b029b2e17f22ea41b4c78c4daf107351d"}
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.043055 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" event={"ID":"188679bc-8b67-4136-94ce-fa515c1c950a","Type":"ContainerStarted","Data":"338a8537317d0f5f601de5b9d14a10031bd912511e91d08e59143447f9685b68"}
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.289743 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8cjdv"]
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.291058 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8cjdv"
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.296289 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8cjdv"]
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.355710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4485\" (UniqueName: \"kubernetes.io/projected/0573a6f7-8a5e-4083-8dc6-64608707229c-kube-api-access-q4485\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv"
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.355872 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-catalog-content\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv"
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.356082 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-utilities\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv"
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.416095 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nfn89_14e81dbf-6c73-481c-b758-4c15cc0f3258/console/0.log"
Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.416162 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.457885 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slf52\" (UniqueName: \"kubernetes.io/projected/14e81dbf-6c73-481c-b758-4c15cc0f3258-kube-api-access-slf52\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.457945 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-config\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.457978 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-serving-cert\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.457999 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-oauth-serving-cert\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.458040 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-service-ca\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.458054 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-oauth-config\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.458070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-trusted-ca-bundle\") pod \"14e81dbf-6c73-481c-b758-4c15cc0f3258\" (UID: \"14e81dbf-6c73-481c-b758-4c15cc0f3258\") " Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.458326 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-utilities\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.458394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4485\" (UniqueName: \"kubernetes.io/projected/0573a6f7-8a5e-4083-8dc6-64608707229c-kube-api-access-q4485\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.458427 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-catalog-content\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.459136 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-config" 
(OuterVolumeSpecName: "console-config") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.459377 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.459660 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.459914 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-catalog-content\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.460078 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-service-ca" (OuterVolumeSpecName: "service-ca") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.460461 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-utilities\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.463564 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.465119 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e81dbf-6c73-481c-b758-4c15cc0f3258-kube-api-access-slf52" (OuterVolumeSpecName: "kube-api-access-slf52") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "kube-api-access-slf52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.467671 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "14e81dbf-6c73-481c-b758-4c15cc0f3258" (UID: "14e81dbf-6c73-481c-b758-4c15cc0f3258"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.475427 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4485\" (UniqueName: \"kubernetes.io/projected/0573a6f7-8a5e-4083-8dc6-64608707229c-kube-api-access-q4485\") pod \"redhat-operators-8cjdv\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559376 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slf52\" (UniqueName: \"kubernetes.io/projected/14e81dbf-6c73-481c-b758-4c15cc0f3258-kube-api-access-slf52\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559429 4922 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559439 4922 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559450 4922 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559462 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559472 4922 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/14e81dbf-6c73-481c-b758-4c15cc0f3258-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.559482 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14e81dbf-6c73-481c-b758-4c15cc0f3258-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.614700 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:46 crc kubenswrapper[4922]: I0218 11:49:46.818043 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8cjdv"] Feb 18 11:49:46 crc kubenswrapper[4922]: W0218 11:49:46.825323 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0573a6f7_8a5e_4083_8dc6_64608707229c.slice/crio-87d76ad8e6106ce18111899a657b4a0c77b34e29c2e101027714b79ae306611e WatchSource:0}: Error finding container 87d76ad8e6106ce18111899a657b4a0c77b34e29c2e101027714b79ae306611e: Status 404 returned error can't find the container with id 87d76ad8e6106ce18111899a657b4a0c77b34e29c2e101027714b79ae306611e Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.050190 4922 generic.go:334] "Generic (PLEG): container finished" podID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerID="518cfc0e2122ba3d8575366d8ca43a4359d78ae39ac7920bc04d33ac9f1a06a7" exitCode=0 Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.051257 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerDied","Data":"518cfc0e2122ba3d8575366d8ca43a4359d78ae39ac7920bc04d33ac9f1a06a7"} Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.051406 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerStarted","Data":"87d76ad8e6106ce18111899a657b4a0c77b34e29c2e101027714b79ae306611e"} Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.053813 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nfn89_14e81dbf-6c73-481c-b758-4c15cc0f3258/console/0.log" Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.053850 4922 generic.go:334] "Generic (PLEG): container finished" podID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerID="af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194" exitCode=2 Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.053899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nfn89" event={"ID":"14e81dbf-6c73-481c-b758-4c15cc0f3258","Type":"ContainerDied","Data":"af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194"} Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.053928 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nfn89" event={"ID":"14e81dbf-6c73-481c-b758-4c15cc0f3258","Type":"ContainerDied","Data":"e3e0521f1d2586e618292c2cbf0b96f0a5f28185a418b592c2676c55e3372f97"} Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.053967 4922 scope.go:117] "RemoveContainer" containerID="af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194" Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.054156 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nfn89" Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.090244 4922 scope.go:117] "RemoveContainer" containerID="af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194" Feb 18 11:49:47 crc kubenswrapper[4922]: E0218 11:49:47.094992 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194\": container with ID starting with af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194 not found: ID does not exist" containerID="af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194" Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.095041 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194"} err="failed to get container status \"af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194\": rpc error: code = NotFound desc = could not find container \"af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194\": container with ID starting with af27bceaff8159c64269dd32349e1085ec2a3938f8c97b9f88035f96de999194 not found: ID does not exist" Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.095240 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nfn89"] Feb 18 11:49:47 crc kubenswrapper[4922]: I0218 11:49:47.101399 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nfn89"] Feb 18 11:49:48 crc kubenswrapper[4922]: I0218 11:49:48.071448 4922 generic.go:334] "Generic (PLEG): container finished" podID="188679bc-8b67-4136-94ce-fa515c1c950a" containerID="00255652c77fcf3ae566adc81c151063bc789a070562e1934905a64c1c694b2c" exitCode=0 Feb 18 11:49:48 crc kubenswrapper[4922]: I0218 11:49:48.071543 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" event={"ID":"188679bc-8b67-4136-94ce-fa515c1c950a","Type":"ContainerDied","Data":"00255652c77fcf3ae566adc81c151063bc789a070562e1934905a64c1c694b2c"} Feb 18 11:49:48 crc kubenswrapper[4922]: I0218 11:49:48.076614 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerStarted","Data":"a2085f95b1cea95e061e2dae349488380da4a4af1b753b095691e935490ce027"} Feb 18 11:49:48 crc kubenswrapper[4922]: I0218 11:49:48.981134 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" path="/var/lib/kubelet/pods/14e81dbf-6c73-481c-b758-4c15cc0f3258/volumes" Feb 18 11:49:49 crc kubenswrapper[4922]: I0218 11:49:49.087204 4922 generic.go:334] "Generic (PLEG): container finished" podID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerID="a2085f95b1cea95e061e2dae349488380da4a4af1b753b095691e935490ce027" exitCode=0 Feb 18 11:49:49 crc kubenswrapper[4922]: I0218 11:49:49.087289 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerDied","Data":"a2085f95b1cea95e061e2dae349488380da4a4af1b753b095691e935490ce027"} Feb 18 11:49:49 crc kubenswrapper[4922]: I0218 11:49:49.092115 4922 generic.go:334] "Generic (PLEG): container finished" podID="188679bc-8b67-4136-94ce-fa515c1c950a" containerID="e110bc3356ab463d5b9cc069bb258ef89272cacab9d97001eb0c4514a140e3c8" exitCode=0 Feb 18 11:49:49 crc kubenswrapper[4922]: I0218 11:49:49.092179 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" 
event={"ID":"188679bc-8b67-4136-94ce-fa515c1c950a","Type":"ContainerDied","Data":"e110bc3356ab463d5b9cc069bb258ef89272cacab9d97001eb0c4514a140e3c8"} Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.104010 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerStarted","Data":"88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb"} Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.129468 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8cjdv" podStartSLOduration=1.536647785 podStartE2EDuration="4.129448402s" podCreationTimestamp="2026-02-18 11:49:46 +0000 UTC" firstStartedPulling="2026-02-18 11:49:47.052333369 +0000 UTC m=+788.780037449" lastFinishedPulling="2026-02-18 11:49:49.645133976 +0000 UTC m=+791.372838066" observedRunningTime="2026-02-18 11:49:50.124963112 +0000 UTC m=+791.852667192" watchObservedRunningTime="2026-02-18 11:49:50.129448402 +0000 UTC m=+791.857152482" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.362519 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.521282 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-bundle\") pod \"188679bc-8b67-4136-94ce-fa515c1c950a\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.521473 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8llc\" (UniqueName: \"kubernetes.io/projected/188679bc-8b67-4136-94ce-fa515c1c950a-kube-api-access-k8llc\") pod \"188679bc-8b67-4136-94ce-fa515c1c950a\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.521570 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-util\") pod \"188679bc-8b67-4136-94ce-fa515c1c950a\" (UID: \"188679bc-8b67-4136-94ce-fa515c1c950a\") " Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.522286 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-bundle" (OuterVolumeSpecName: "bundle") pod "188679bc-8b67-4136-94ce-fa515c1c950a" (UID: "188679bc-8b67-4136-94ce-fa515c1c950a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.529633 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188679bc-8b67-4136-94ce-fa515c1c950a-kube-api-access-k8llc" (OuterVolumeSpecName: "kube-api-access-k8llc") pod "188679bc-8b67-4136-94ce-fa515c1c950a" (UID: "188679bc-8b67-4136-94ce-fa515c1c950a"). InnerVolumeSpecName "kube-api-access-k8llc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.556482 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-util" (OuterVolumeSpecName: "util") pod "188679bc-8b67-4136-94ce-fa515c1c950a" (UID: "188679bc-8b67-4136-94ce-fa515c1c950a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.622980 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8llc\" (UniqueName: \"kubernetes.io/projected/188679bc-8b67-4136-94ce-fa515c1c950a-kube-api-access-k8llc\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.623023 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-util\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:50 crc kubenswrapper[4922]: I0218 11:49:50.623037 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/188679bc-8b67-4136-94ce-fa515c1c950a-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:49:51 crc kubenswrapper[4922]: I0218 11:49:51.112712 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" Feb 18 11:49:51 crc kubenswrapper[4922]: I0218 11:49:51.112724 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4" event={"ID":"188679bc-8b67-4136-94ce-fa515c1c950a","Type":"ContainerDied","Data":"338a8537317d0f5f601de5b9d14a10031bd912511e91d08e59143447f9685b68"} Feb 18 11:49:51 crc kubenswrapper[4922]: I0218 11:49:51.112767 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="338a8537317d0f5f601de5b9d14a10031bd912511e91d08e59143447f9685b68" Feb 18 11:49:56 crc kubenswrapper[4922]: I0218 11:49:56.615587 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:56 crc kubenswrapper[4922]: I0218 11:49:56.616677 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:56 crc kubenswrapper[4922]: I0218 11:49:56.659166 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:57 crc kubenswrapper[4922]: I0218 11:49:57.212998 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:49:59 crc kubenswrapper[4922]: I0218 11:49:59.076554 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8cjdv"] Feb 18 11:49:59 crc kubenswrapper[4922]: I0218 11:49:59.163015 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8cjdv" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="registry-server" containerID="cri-o://88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb" gracePeriod=2 Feb 18 11:49:59 
crc kubenswrapper[4922]: E0218 11:49:59.629731 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0573a6f7_8a5e_4083_8dc6_64608707229c.slice/crio-88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0573a6f7_8a5e_4083_8dc6_64608707229c.slice/crio-conmon-88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb.scope\": RecentStats: unable to find data in memory cache]" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.175389 4922 generic.go:334] "Generic (PLEG): container finished" podID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerID="88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb" exitCode=0 Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.175431 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerDied","Data":"88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb"} Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.295552 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.345570 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4485\" (UniqueName: \"kubernetes.io/projected/0573a6f7-8a5e-4083-8dc6-64608707229c-kube-api-access-q4485\") pod \"0573a6f7-8a5e-4083-8dc6-64608707229c\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.345754 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-catalog-content\") pod \"0573a6f7-8a5e-4083-8dc6-64608707229c\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.345823 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-utilities\") pod \"0573a6f7-8a5e-4083-8dc6-64608707229c\" (UID: \"0573a6f7-8a5e-4083-8dc6-64608707229c\") " Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.346709 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-utilities" (OuterVolumeSpecName: "utilities") pod "0573a6f7-8a5e-4083-8dc6-64608707229c" (UID: "0573a6f7-8a5e-4083-8dc6-64608707229c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.352031 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0573a6f7-8a5e-4083-8dc6-64608707229c-kube-api-access-q4485" (OuterVolumeSpecName: "kube-api-access-q4485") pod "0573a6f7-8a5e-4083-8dc6-64608707229c" (UID: "0573a6f7-8a5e-4083-8dc6-64608707229c"). InnerVolumeSpecName "kube-api-access-q4485". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.447409 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4485\" (UniqueName: \"kubernetes.io/projected/0573a6f7-8a5e-4083-8dc6-64608707229c-kube-api-access-q4485\") on node \"crc\" DevicePath \"\"" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.447447 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.491833 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0573a6f7-8a5e-4083-8dc6-64608707229c" (UID: "0573a6f7-8a5e-4083-8dc6-64608707229c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:50:00 crc kubenswrapper[4922]: I0218 11:50:00.548217 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0573a6f7-8a5e-4083-8dc6-64608707229c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.183722 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cjdv" event={"ID":"0573a6f7-8a5e-4083-8dc6-64608707229c","Type":"ContainerDied","Data":"87d76ad8e6106ce18111899a657b4a0c77b34e29c2e101027714b79ae306611e"} Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.183788 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8cjdv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.184017 4922 scope.go:117] "RemoveContainer" containerID="88ca51ca1d2b61ac84bdfb12a835fecc00413b6efaf7b24f5fd72c26c077f3eb" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.203717 4922 scope.go:117] "RemoveContainer" containerID="a2085f95b1cea95e061e2dae349488380da4a4af1b753b095691e935490ce027" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.208562 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8cjdv"] Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.214057 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8cjdv"] Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.220152 4922 scope.go:117] "RemoveContainer" containerID="518cfc0e2122ba3d8575366d8ca43a4359d78ae39ac7920bc04d33ac9f1a06a7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.262955 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv"] Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263174 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="extract" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263185 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="extract" Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263201 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerName="console" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263207 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerName="console" Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263216 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="registry-server" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263223 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="registry-server" Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263232 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="extract-utilities" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263238 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="extract-utilities" Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263248 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="pull" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263254 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="pull" Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263261 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="extract-content" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263266 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="extract-content" Feb 18 11:50:01 crc kubenswrapper[4922]: E0218 11:50:01.263276 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="util" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263281 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="util" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263394 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="188679bc-8b67-4136-94ce-fa515c1c950a" containerName="extract" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263402 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e81dbf-6c73-481c-b758-4c15cc0f3258" containerName="console" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263414 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" containerName="registry-server" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.263781 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.266078 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.266226 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.266440 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.267513 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.267725 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vdh67" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.289080 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv"] Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.457893 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5vs5\" (UniqueName: 
\"kubernetes.io/projected/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-kube-api-access-x5vs5\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.457942 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-apiservice-cert\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.457967 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-webhook-cert\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.507209 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7"] Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.508444 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.510295 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9shkx" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.510734 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.512130 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.530601 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7"] Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.558722 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-apiservice-cert\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.558761 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5vs5\" (UniqueName: \"kubernetes.io/projected/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-kube-api-access-x5vs5\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.558785 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-webhook-cert\") pod 
\"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.564487 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-apiservice-cert\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.565881 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-webhook-cert\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.576315 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5vs5\" (UniqueName: \"kubernetes.io/projected/9fbb7bfe-c8d9-4a50-9326-bf07e99f4336-kube-api-access-x5vs5\") pod \"metallb-operator-controller-manager-576949b4c-vwcqv\" (UID: \"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336\") " pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.582303 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.660521 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxw5f\" (UniqueName: \"kubernetes.io/projected/7c9c6b01-e766-411c-a275-ae7ea3a9659e-kube-api-access-mxw5f\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.660856 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9c6b01-e766-411c-a275-ae7ea3a9659e-apiservice-cert\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.660908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9c6b01-e766-411c-a275-ae7ea3a9659e-webhook-cert\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.762488 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9c6b01-e766-411c-a275-ae7ea3a9659e-apiservice-cert\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.762571 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9c6b01-e766-411c-a275-ae7ea3a9659e-webhook-cert\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.762647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw5f\" (UniqueName: \"kubernetes.io/projected/7c9c6b01-e766-411c-a275-ae7ea3a9659e-kube-api-access-mxw5f\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.768645 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9c6b01-e766-411c-a275-ae7ea3a9659e-webhook-cert\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.769980 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9c6b01-e766-411c-a275-ae7ea3a9659e-apiservice-cert\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.780256 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxw5f\" (UniqueName: \"kubernetes.io/projected/7c9c6b01-e766-411c-a275-ae7ea3a9659e-kube-api-access-mxw5f\") pod \"metallb-operator-webhook-server-6d8c5554f7-psxr7\" (UID: \"7c9c6b01-e766-411c-a275-ae7ea3a9659e\") " 
pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.825430 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:01 crc kubenswrapper[4922]: I0218 11:50:01.845415 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv"] Feb 18 11:50:01 crc kubenswrapper[4922]: W0218 11:50:01.874595 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fbb7bfe_c8d9_4a50_9326_bf07e99f4336.slice/crio-0e1e249c09887447ab6ab288b159c1b353766d339abbb2bfc52339a64e5c8241 WatchSource:0}: Error finding container 0e1e249c09887447ab6ab288b159c1b353766d339abbb2bfc52339a64e5c8241: Status 404 returned error can't find the container with id 0e1e249c09887447ab6ab288b159c1b353766d339abbb2bfc52339a64e5c8241 Feb 18 11:50:02 crc kubenswrapper[4922]: I0218 11:50:02.190966 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" event={"ID":"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336","Type":"ContainerStarted","Data":"0e1e249c09887447ab6ab288b159c1b353766d339abbb2bfc52339a64e5c8241"} Feb 18 11:50:02 crc kubenswrapper[4922]: I0218 11:50:02.371094 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7"] Feb 18 11:50:02 crc kubenswrapper[4922]: W0218 11:50:02.380377 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c9c6b01_e766_411c_a275_ae7ea3a9659e.slice/crio-c2c6602927306626fdf9e4ef0dfd7403d6411d01d167010965a74fa9221cf333 WatchSource:0}: Error finding container c2c6602927306626fdf9e4ef0dfd7403d6411d01d167010965a74fa9221cf333: Status 404 returned error can't find the 
container with id c2c6602927306626fdf9e4ef0dfd7403d6411d01d167010965a74fa9221cf333 Feb 18 11:50:02 crc kubenswrapper[4922]: I0218 11:50:02.988301 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0573a6f7-8a5e-4083-8dc6-64608707229c" path="/var/lib/kubelet/pods/0573a6f7-8a5e-4083-8dc6-64608707229c/volumes" Feb 18 11:50:03 crc kubenswrapper[4922]: I0218 11:50:03.199809 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" event={"ID":"7c9c6b01-e766-411c-a275-ae7ea3a9659e","Type":"ContainerStarted","Data":"c2c6602927306626fdf9e4ef0dfd7403d6411d01d167010965a74fa9221cf333"} Feb 18 11:50:06 crc kubenswrapper[4922]: I0218 11:50:06.231241 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" event={"ID":"9fbb7bfe-c8d9-4a50-9326-bf07e99f4336","Type":"ContainerStarted","Data":"a8c1643848298fd562d8eb39564b149627303cc39e149e5915f0fc077e5615d3"} Feb 18 11:50:06 crc kubenswrapper[4922]: I0218 11:50:06.231641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:06 crc kubenswrapper[4922]: I0218 11:50:06.256569 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" podStartSLOduration=1.392976516 podStartE2EDuration="5.256548466s" podCreationTimestamp="2026-02-18 11:50:01 +0000 UTC" firstStartedPulling="2026-02-18 11:50:01.883164954 +0000 UTC m=+803.610869034" lastFinishedPulling="2026-02-18 11:50:05.746736904 +0000 UTC m=+807.474440984" observedRunningTime="2026-02-18 11:50:06.253744387 +0000 UTC m=+807.981448467" watchObservedRunningTime="2026-02-18 11:50:06.256548466 +0000 UTC m=+807.984252546" Feb 18 11:50:08 crc kubenswrapper[4922]: I0218 11:50:08.252393 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" event={"ID":"7c9c6b01-e766-411c-a275-ae7ea3a9659e","Type":"ContainerStarted","Data":"0dacb45b1950eba87778f5769b3face2a5481b97a135aef796b6629ca69b4a00"} Feb 18 11:50:08 crc kubenswrapper[4922]: I0218 11:50:08.252781 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:08 crc kubenswrapper[4922]: I0218 11:50:08.272345 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" podStartSLOduration=1.773910783 podStartE2EDuration="7.272319129s" podCreationTimestamp="2026-02-18 11:50:01 +0000 UTC" firstStartedPulling="2026-02-18 11:50:02.382701684 +0000 UTC m=+804.110405764" lastFinishedPulling="2026-02-18 11:50:07.88111001 +0000 UTC m=+809.608814110" observedRunningTime="2026-02-18 11:50:08.26952243 +0000 UTC m=+809.997226600" watchObservedRunningTime="2026-02-18 11:50:08.272319129 +0000 UTC m=+810.000023219" Feb 18 11:50:09 crc kubenswrapper[4922]: I0218 11:50:09.807572 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:50:09 crc kubenswrapper[4922]: I0218 11:50:09.807892 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:50:17 crc kubenswrapper[4922]: I0218 11:50:17.080151 4922 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","burstable","pod14e81dbf-6c73-481c-b758-4c15cc0f3258"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod14e81dbf-6c73-481c-b758-4c15cc0f3258] : Timed out while waiting for systemd to remove kubepods-burstable-pod14e81dbf_6c73_481c_b758_4c15cc0f3258.slice" Feb 18 11:50:21 crc kubenswrapper[4922]: I0218 11:50:21.829599 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6d8c5554f7-psxr7" Feb 18 11:50:39 crc kubenswrapper[4922]: I0218 11:50:39.808144 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:50:39 crc kubenswrapper[4922]: I0218 11:50:39.808749 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:50:39 crc kubenswrapper[4922]: I0218 11:50:39.808801 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:50:39 crc kubenswrapper[4922]: I0218 11:50:39.809484 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ea4c69ba94e3a69c2a9d6932ead886d1aa8f5af4ec72d79e294ae3c3d8f54dd"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 11:50:39 crc kubenswrapper[4922]: I0218 11:50:39.809554 4922 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://0ea4c69ba94e3a69c2a9d6932ead886d1aa8f5af4ec72d79e294ae3c3d8f54dd" gracePeriod=600 Feb 18 11:50:40 crc kubenswrapper[4922]: I0218 11:50:40.451310 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="0ea4c69ba94e3a69c2a9d6932ead886d1aa8f5af4ec72d79e294ae3c3d8f54dd" exitCode=0 Feb 18 11:50:40 crc kubenswrapper[4922]: I0218 11:50:40.451390 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"0ea4c69ba94e3a69c2a9d6932ead886d1aa8f5af4ec72d79e294ae3c3d8f54dd"} Feb 18 11:50:40 crc kubenswrapper[4922]: I0218 11:50:40.451882 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"3ecb5c1316b12312a3aa69dd6c259c33fc3f21c1c782e1e74d55d4b725fb05a8"} Feb 18 11:50:40 crc kubenswrapper[4922]: I0218 11:50:40.451904 4922 scope.go:117] "RemoveContainer" containerID="02db91f6fc3d787614fdd835995da44a7dcbace8afdf71101bf73a1aefb8d53b" Feb 18 11:50:41 crc kubenswrapper[4922]: I0218 11:50:41.585351 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-576949b4c-vwcqv" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.314975 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f"] Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.316578 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.320442 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fwwpd"] Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.322642 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-c2whk" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.323734 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.323733 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.334094 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.334091 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.361975 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f"] Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.464691 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7rvcx"] Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.465810 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.470894 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.471006 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.471216 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2dznj" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.473327 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.482021 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-8ds4f"] Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.483139 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.485326 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.505920 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqqbk\" (UniqueName: \"kubernetes.io/projected/74e84ee6-9d14-48aa-9e59-f1ee46e15fcf-kube-api-access-bqqbk\") pod \"frr-k8s-webhook-server-78b44bf5bb-9cn2f\" (UID: \"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.505969 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-metrics\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506032 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-sockets\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506125 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvzvb\" (UniqueName: \"kubernetes.io/projected/d069bacc-29a2-4aeb-9437-e654621c73c8-kube-api-access-zvzvb\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506179 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d069bacc-29a2-4aeb-9437-e654621c73c8-metrics-certs\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506235 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-startup\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506268 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-reloader\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506318 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-conf\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74e84ee6-9d14-48aa-9e59-f1ee46e15fcf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-9cn2f\" (UID: \"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.506443 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-8ds4f"] Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607583 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d069bacc-29a2-4aeb-9437-e654621c73c8-metrics-certs\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607633 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-startup\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607668 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-reloader\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7989\" (UniqueName: \"kubernetes.io/projected/4e80d896-3eb4-4dc8-b217-441a5a09dd05-kube-api-access-t7989\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-conf\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607762 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/74e84ee6-9d14-48aa-9e59-f1ee46e15fcf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-9cn2f\" (UID: \"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607783 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-metrics-certs\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607817 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-memberlist\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607853 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqqbk\" (UniqueName: \"kubernetes.io/projected/74e84ee6-9d14-48aa-9e59-f1ee46e15fcf-kube-api-access-bqqbk\") pod \"frr-k8s-webhook-server-78b44bf5bb-9cn2f\" (UID: \"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607888 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aa729491-0a34-4772-8178-d8566c355add-metallb-excludel2\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-metrics\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607958 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-sockets\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.607983 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80d896-3eb4-4dc8-b217-441a5a09dd05-cert\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.608010 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh8k\" (UniqueName: \"kubernetes.io/projected/aa729491-0a34-4772-8178-d8566c355add-kube-api-access-cxh8k\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.608064 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e80d896-3eb4-4dc8-b217-441a5a09dd05-metrics-certs\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.608084 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvzvb\" (UniqueName: 
\"kubernetes.io/projected/d069bacc-29a2-4aeb-9437-e654621c73c8-kube-api-access-zvzvb\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.608583 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-reloader\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.608966 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-startup\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.609090 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-sockets\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.609111 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-frr-conf\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.609328 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d069bacc-29a2-4aeb-9437-e654621c73c8-metrics\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.613921 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d069bacc-29a2-4aeb-9437-e654621c73c8-metrics-certs\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.614088 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74e84ee6-9d14-48aa-9e59-f1ee46e15fcf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-9cn2f\" (UID: \"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.626564 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvzvb\" (UniqueName: \"kubernetes.io/projected/d069bacc-29a2-4aeb-9437-e654621c73c8-kube-api-access-zvzvb\") pod \"frr-k8s-fwwpd\" (UID: \"d069bacc-29a2-4aeb-9437-e654621c73c8\") " pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.628219 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqqbk\" (UniqueName: \"kubernetes.io/projected/74e84ee6-9d14-48aa-9e59-f1ee46e15fcf-kube-api-access-bqqbk\") pod \"frr-k8s-webhook-server-78b44bf5bb-9cn2f\" (UID: \"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.640273 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.652731 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.709938 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7989\" (UniqueName: \"kubernetes.io/projected/4e80d896-3eb4-4dc8-b217-441a5a09dd05-kube-api-access-t7989\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.709993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-metrics-certs\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.710021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-memberlist\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.710064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aa729491-0a34-4772-8178-d8566c355add-metallb-excludel2\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.710108 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80d896-3eb4-4dc8-b217-441a5a09dd05-cert\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.710134 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh8k\" (UniqueName: \"kubernetes.io/projected/aa729491-0a34-4772-8178-d8566c355add-kube-api-access-cxh8k\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.710165 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e80d896-3eb4-4dc8-b217-441a5a09dd05-metrics-certs\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.711114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aa729491-0a34-4772-8178-d8566c355add-metallb-excludel2\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: E0218 11:50:42.711234 4922 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 11:50:42 crc kubenswrapper[4922]: E0218 11:50:42.711280 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-memberlist podName:aa729491-0a34-4772-8178-d8566c355add nodeName:}" failed. No retries permitted until 2026-02-18 11:50:43.21126672 +0000 UTC m=+844.938970800 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-memberlist") pod "speaker-7rvcx" (UID: "aa729491-0a34-4772-8178-d8566c355add") : secret "metallb-memberlist" not found Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.714669 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.714865 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-metrics-certs\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.715129 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e80d896-3eb4-4dc8-b217-441a5a09dd05-metrics-certs\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.727113 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e80d896-3eb4-4dc8-b217-441a5a09dd05-cert\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.731617 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7989\" (UniqueName: \"kubernetes.io/projected/4e80d896-3eb4-4dc8-b217-441a5a09dd05-kube-api-access-t7989\") pod \"controller-69bbfbf88f-8ds4f\" (UID: \"4e80d896-3eb4-4dc8-b217-441a5a09dd05\") " pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.749639 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh8k\" (UniqueName: \"kubernetes.io/projected/aa729491-0a34-4772-8178-d8566c355add-kube-api-access-cxh8k\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.802640 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:42 crc kubenswrapper[4922]: I0218 11:50:42.876686 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f"] Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.002729 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-8ds4f"] Feb 18 11:50:43 crc kubenswrapper[4922]: W0218 11:50:43.007165 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e80d896_3eb4_4dc8_b217_441a5a09dd05.slice/crio-9e0ca6b826edd881d338755a10261f41cd0607a7eb79366ee36d376833725a87 WatchSource:0}: Error finding container 9e0ca6b826edd881d338755a10261f41cd0607a7eb79366ee36d376833725a87: Status 404 returned error can't find the container with id 9e0ca6b826edd881d338755a10261f41cd0607a7eb79366ee36d376833725a87 Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.218887 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-memberlist\") pod \"speaker-7rvcx\" (UID: \"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.227006 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aa729491-0a34-4772-8178-d8566c355add-memberlist\") pod \"speaker-7rvcx\" (UID: 
\"aa729491-0a34-4772-8178-d8566c355add\") " pod="metallb-system/speaker-7rvcx" Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.379421 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7rvcx" Feb 18 11:50:43 crc kubenswrapper[4922]: W0218 11:50:43.401872 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa729491_0a34_4772_8178_d8566c355add.slice/crio-fd6b3bf8d78b636ec606e71a7241b2afc71f99d1f761934d419f6dab90b2bc54 WatchSource:0}: Error finding container fd6b3bf8d78b636ec606e71a7241b2afc71f99d1f761934d419f6dab90b2bc54: Status 404 returned error can't find the container with id fd6b3bf8d78b636ec606e71a7241b2afc71f99d1f761934d419f6dab90b2bc54 Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.498281 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" event={"ID":"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf","Type":"ContainerStarted","Data":"c700c4485c2f30fff8a764e4fe0d5d21ed650a7278e8cc60dfa005229bf71587"} Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.499978 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"dd093157a54d99e46ba2c1e7b3479f74534b4f69d660714eb9cb7fab2bee10a3"} Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.502258 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7rvcx" event={"ID":"aa729491-0a34-4772-8178-d8566c355add","Type":"ContainerStarted","Data":"fd6b3bf8d78b636ec606e71a7241b2afc71f99d1f761934d419f6dab90b2bc54"} Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.508975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-8ds4f" 
event={"ID":"4e80d896-3eb4-4dc8-b217-441a5a09dd05","Type":"ContainerStarted","Data":"c28e7f3963f59d42e3f95d86350d9c59cfb841c6daabe2cf1440919548cf92fa"} Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.509051 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-8ds4f" event={"ID":"4e80d896-3eb4-4dc8-b217-441a5a09dd05","Type":"ContainerStarted","Data":"1a095f7b3232e73c8ddc8b63f1c0c79dd1c30f459842e80e5416316587b84041"} Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.509073 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-8ds4f" event={"ID":"4e80d896-3eb4-4dc8-b217-441a5a09dd05","Type":"ContainerStarted","Data":"9e0ca6b826edd881d338755a10261f41cd0607a7eb79366ee36d376833725a87"} Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.509220 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:50:43 crc kubenswrapper[4922]: I0218 11:50:43.532741 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-8ds4f" podStartSLOduration=1.532703717 podStartE2EDuration="1.532703717s" podCreationTimestamp="2026-02-18 11:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:50:43.526821683 +0000 UTC m=+845.254525783" watchObservedRunningTime="2026-02-18 11:50:43.532703717 +0000 UTC m=+845.260407797" Feb 18 11:50:44 crc kubenswrapper[4922]: I0218 11:50:44.517306 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7rvcx" event={"ID":"aa729491-0a34-4772-8178-d8566c355add","Type":"ContainerStarted","Data":"80cbd1fd2ea9db1f07cfe08607f0bf5fbb3d830bda781f3858b1f3689337c2b7"} Feb 18 11:50:44 crc kubenswrapper[4922]: I0218 11:50:44.518565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-7rvcx" event={"ID":"aa729491-0a34-4772-8178-d8566c355add","Type":"ContainerStarted","Data":"f59d365f37d4f76f87e5b85c0896de123746606dd89d25dbb109c0fedcde263c"} Feb 18 11:50:44 crc kubenswrapper[4922]: I0218 11:50:44.518605 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7rvcx" Feb 18 11:50:44 crc kubenswrapper[4922]: I0218 11:50:44.538638 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7rvcx" podStartSLOduration=2.538608174 podStartE2EDuration="2.538608174s" podCreationTimestamp="2026-02-18 11:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:50:44.53763049 +0000 UTC m=+846.265334570" watchObservedRunningTime="2026-02-18 11:50:44.538608174 +0000 UTC m=+846.266312254" Feb 18 11:50:50 crc kubenswrapper[4922]: I0218 11:50:50.567042 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" event={"ID":"74e84ee6-9d14-48aa-9e59-f1ee46e15fcf","Type":"ContainerStarted","Data":"ba2534187ceeb73d0d90dddec4dd035aff8b92f51719eadcfd2a3263ab03a830"} Feb 18 11:50:50 crc kubenswrapper[4922]: I0218 11:50:50.567650 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:50:50 crc kubenswrapper[4922]: I0218 11:50:50.569108 4922 generic.go:334] "Generic (PLEG): container finished" podID="d069bacc-29a2-4aeb-9437-e654621c73c8" containerID="dd2c12a86f634e49366c0496a7fa0b1cb6cdd22fb0ca2230ef33223ed8465cd5" exitCode=0 Feb 18 11:50:50 crc kubenswrapper[4922]: I0218 11:50:50.569149 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerDied","Data":"dd2c12a86f634e49366c0496a7fa0b1cb6cdd22fb0ca2230ef33223ed8465cd5"} Feb 18 
11:50:50 crc kubenswrapper[4922]: I0218 11:50:50.591049 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" podStartSLOduration=1.396541122 podStartE2EDuration="8.591031148s" podCreationTimestamp="2026-02-18 11:50:42 +0000 UTC" firstStartedPulling="2026-02-18 11:50:42.88631476 +0000 UTC m=+844.614018840" lastFinishedPulling="2026-02-18 11:50:50.080804776 +0000 UTC m=+851.808508866" observedRunningTime="2026-02-18 11:50:50.587715117 +0000 UTC m=+852.315419197" watchObservedRunningTime="2026-02-18 11:50:50.591031148 +0000 UTC m=+852.318735228" Feb 18 11:50:51 crc kubenswrapper[4922]: I0218 11:50:51.577968 4922 generic.go:334] "Generic (PLEG): container finished" podID="d069bacc-29a2-4aeb-9437-e654621c73c8" containerID="6c84959e9441ad2a088e1d84890e555a1afcc5db5a316080377dfcaef787553d" exitCode=0 Feb 18 11:50:51 crc kubenswrapper[4922]: I0218 11:50:51.578040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerDied","Data":"6c84959e9441ad2a088e1d84890e555a1afcc5db5a316080377dfcaef787553d"} Feb 18 11:50:52 crc kubenswrapper[4922]: I0218 11:50:52.586430 4922 generic.go:334] "Generic (PLEG): container finished" podID="d069bacc-29a2-4aeb-9437-e654621c73c8" containerID="21966ed5eca5df8b6a2c98f7afcf86a7948ab14135f37cac606291a98f8e4595" exitCode=0 Feb 18 11:50:52 crc kubenswrapper[4922]: I0218 11:50:52.586473 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerDied","Data":"21966ed5eca5df8b6a2c98f7afcf86a7948ab14135f37cac606291a98f8e4595"} Feb 18 11:50:53 crc kubenswrapper[4922]: I0218 11:50:53.382146 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7rvcx" Feb 18 11:50:53 crc kubenswrapper[4922]: I0218 11:50:53.604058 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"cccacbb85d963e92575c7289808168a688091dbb6f20629a1bb6201480f3feb3"} Feb 18 11:50:53 crc kubenswrapper[4922]: I0218 11:50:53.604121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"473686045b41915ce5b680985e16192d4c921709f6508bdb13249c8baea3a22b"} Feb 18 11:50:53 crc kubenswrapper[4922]: I0218 11:50:53.604131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"193aab8d8ca9a911883eddc72edcc8981c1085200f0a894cb79d7596a3fab13d"} Feb 18 11:50:53 crc kubenswrapper[4922]: I0218 11:50:53.604139 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"a58ea9fc19026d27e1bb4484655923385bd5f4ea4ac7e9897e8d634315786c2f"} Feb 18 11:50:53 crc kubenswrapper[4922]: I0218 11:50:53.604164 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"dcd9db767feb7caafeeb4bd85518538028ae08c124f64fd727f6812221d071e4"} Feb 18 11:50:54 crc kubenswrapper[4922]: I0218 11:50:54.616130 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fwwpd" event={"ID":"d069bacc-29a2-4aeb-9437-e654621c73c8","Type":"ContainerStarted","Data":"5fc9d22b47ab0d541e8edfc67262724185c66fdb775d5c51055e2154bb16f472"} Feb 18 11:50:54 crc kubenswrapper[4922]: I0218 11:50:54.617116 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:54 crc kubenswrapper[4922]: I0218 11:50:54.638133 4922 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fwwpd" podStartSLOduration=5.382892618 podStartE2EDuration="12.638115425s" podCreationTimestamp="2026-02-18 11:50:42 +0000 UTC" firstStartedPulling="2026-02-18 11:50:42.83541631 +0000 UTC m=+844.563120390" lastFinishedPulling="2026-02-18 11:50:50.090639117 +0000 UTC m=+851.818343197" observedRunningTime="2026-02-18 11:50:54.636232329 +0000 UTC m=+856.363936409" watchObservedRunningTime="2026-02-18 11:50:54.638115425 +0000 UTC m=+856.365819505" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.173100 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h4fcj"] Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.174154 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.179796 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8m47b" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.179856 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.179805 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.186338 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h4fcj"] Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.317407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmjnp\" (UniqueName: \"kubernetes.io/projected/987d782f-67f5-4ce7-bd98-cea59f177e8d-kube-api-access-mmjnp\") pod \"openstack-operator-index-h4fcj\" (UID: 
\"987d782f-67f5-4ce7-bd98-cea59f177e8d\") " pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.419223 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmjnp\" (UniqueName: \"kubernetes.io/projected/987d782f-67f5-4ce7-bd98-cea59f177e8d-kube-api-access-mmjnp\") pod \"openstack-operator-index-h4fcj\" (UID: \"987d782f-67f5-4ce7-bd98-cea59f177e8d\") " pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.437743 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmjnp\" (UniqueName: \"kubernetes.io/projected/987d782f-67f5-4ce7-bd98-cea59f177e8d-kube-api-access-mmjnp\") pod \"openstack-operator-index-h4fcj\" (UID: \"987d782f-67f5-4ce7-bd98-cea59f177e8d\") " pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.552575 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:50:56 crc kubenswrapper[4922]: I0218 11:50:56.955278 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h4fcj"] Feb 18 11:50:56 crc kubenswrapper[4922]: W0218 11:50:56.967407 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987d782f_67f5_4ce7_bd98_cea59f177e8d.slice/crio-220b34ee83ba684aad00844ee0d2258af0223cfccb2c56b89dd704093a1ea2f0 WatchSource:0}: Error finding container 220b34ee83ba684aad00844ee0d2258af0223cfccb2c56b89dd704093a1ea2f0: Status 404 returned error can't find the container with id 220b34ee83ba684aad00844ee0d2258af0223cfccb2c56b89dd704093a1ea2f0 Feb 18 11:50:57 crc kubenswrapper[4922]: I0218 11:50:57.640847 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h4fcj" event={"ID":"987d782f-67f5-4ce7-bd98-cea59f177e8d","Type":"ContainerStarted","Data":"220b34ee83ba684aad00844ee0d2258af0223cfccb2c56b89dd704093a1ea2f0"} Feb 18 11:50:57 crc kubenswrapper[4922]: I0218 11:50:57.653288 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:57 crc kubenswrapper[4922]: I0218 11:50:57.696135 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:50:59 crc kubenswrapper[4922]: I0218 11:50:59.514029 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h4fcj"] Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.126250 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-8rrxt"] Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.128382 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.134087 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8rrxt"] Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.273214 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwsg\" (UniqueName: \"kubernetes.io/projected/191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a-kube-api-access-rnwsg\") pod \"openstack-operator-index-8rrxt\" (UID: \"191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a\") " pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.373994 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwsg\" (UniqueName: \"kubernetes.io/projected/191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a-kube-api-access-rnwsg\") pod \"openstack-operator-index-8rrxt\" (UID: \"191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a\") " pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.396471 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwsg\" (UniqueName: \"kubernetes.io/projected/191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a-kube-api-access-rnwsg\") pod \"openstack-operator-index-8rrxt\" (UID: \"191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a\") " pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:00 crc kubenswrapper[4922]: I0218 11:51:00.448178 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.167253 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8rrxt"] Feb 18 11:51:01 crc kubenswrapper[4922]: W0218 11:51:01.174138 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191b8ec5_c4e8_4e8c_92c2_fa2fd655f94a.slice/crio-35ff7845aa20abd551265e8f928cb4147f448a43d60941fcbec09c09708a6618 WatchSource:0}: Error finding container 35ff7845aa20abd551265e8f928cb4147f448a43d60941fcbec09c09708a6618: Status 404 returned error can't find the container with id 35ff7845aa20abd551265e8f928cb4147f448a43d60941fcbec09c09708a6618 Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.667200 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8rrxt" event={"ID":"191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a","Type":"ContainerStarted","Data":"8e7386d473cb8963d0167a473abef9786a4348b15950a6f1efb44730d0c1bb6a"} Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.667262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8rrxt" event={"ID":"191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a","Type":"ContainerStarted","Data":"35ff7845aa20abd551265e8f928cb4147f448a43d60941fcbec09c09708a6618"} Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.669851 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h4fcj" event={"ID":"987d782f-67f5-4ce7-bd98-cea59f177e8d","Type":"ContainerStarted","Data":"1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615"} Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.669989 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-h4fcj" 
podUID="987d782f-67f5-4ce7-bd98-cea59f177e8d" containerName="registry-server" containerID="cri-o://1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615" gracePeriod=2 Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.688172 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-8rrxt" podStartSLOduration=1.614246297 podStartE2EDuration="1.688153943s" podCreationTimestamp="2026-02-18 11:51:00 +0000 UTC" firstStartedPulling="2026-02-18 11:51:01.180261147 +0000 UTC m=+862.907965217" lastFinishedPulling="2026-02-18 11:51:01.254168783 +0000 UTC m=+862.981872863" observedRunningTime="2026-02-18 11:51:01.687929787 +0000 UTC m=+863.415633867" watchObservedRunningTime="2026-02-18 11:51:01.688153943 +0000 UTC m=+863.415858023" Feb 18 11:51:01 crc kubenswrapper[4922]: I0218 11:51:01.705735 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h4fcj" podStartSLOduration=1.639864966 podStartE2EDuration="5.705713634s" podCreationTimestamp="2026-02-18 11:50:56 +0000 UTC" firstStartedPulling="2026-02-18 11:50:56.970022713 +0000 UTC m=+858.697726793" lastFinishedPulling="2026-02-18 11:51:01.035871371 +0000 UTC m=+862.763575461" observedRunningTime="2026-02-18 11:51:01.701779547 +0000 UTC m=+863.429483667" watchObservedRunningTime="2026-02-18 11:51:01.705713634 +0000 UTC m=+863.433417714" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.033903 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.198160 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmjnp\" (UniqueName: \"kubernetes.io/projected/987d782f-67f5-4ce7-bd98-cea59f177e8d-kube-api-access-mmjnp\") pod \"987d782f-67f5-4ce7-bd98-cea59f177e8d\" (UID: \"987d782f-67f5-4ce7-bd98-cea59f177e8d\") " Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.205011 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987d782f-67f5-4ce7-bd98-cea59f177e8d-kube-api-access-mmjnp" (OuterVolumeSpecName: "kube-api-access-mmjnp") pod "987d782f-67f5-4ce7-bd98-cea59f177e8d" (UID: "987d782f-67f5-4ce7-bd98-cea59f177e8d"). InnerVolumeSpecName "kube-api-access-mmjnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.299975 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmjnp\" (UniqueName: \"kubernetes.io/projected/987d782f-67f5-4ce7-bd98-cea59f177e8d-kube-api-access-mmjnp\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.648246 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-9cn2f" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.656133 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fwwpd" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.682401 4922 generic.go:334] "Generic (PLEG): container finished" podID="987d782f-67f5-4ce7-bd98-cea59f177e8d" containerID="1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615" exitCode=0 Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.683035 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h4fcj" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.693057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h4fcj" event={"ID":"987d782f-67f5-4ce7-bd98-cea59f177e8d","Type":"ContainerDied","Data":"1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615"} Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.693107 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h4fcj" event={"ID":"987d782f-67f5-4ce7-bd98-cea59f177e8d","Type":"ContainerDied","Data":"220b34ee83ba684aad00844ee0d2258af0223cfccb2c56b89dd704093a1ea2f0"} Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.693153 4922 scope.go:117] "RemoveContainer" containerID="1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.721312 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h4fcj"] Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.726147 4922 scope.go:117] "RemoveContainer" containerID="1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615" Feb 18 11:51:02 crc kubenswrapper[4922]: E0218 11:51:02.726586 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615\": container with ID starting with 1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615 not found: ID does not exist" containerID="1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.726640 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615"} err="failed to get container status 
\"1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615\": rpc error: code = NotFound desc = could not find container \"1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615\": container with ID starting with 1c29f0c6da2f82b64ece39e9fb4215cb1c2eb6d0c555e56f73f056d501dd0615 not found: ID does not exist" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.728694 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-h4fcj"] Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.808513 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-8ds4f" Feb 18 11:51:02 crc kubenswrapper[4922]: I0218 11:51:02.983574 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987d782f-67f5-4ce7-bd98-cea59f177e8d" path="/var/lib/kubelet/pods/987d782f-67f5-4ce7-bd98-cea59f177e8d/volumes" Feb 18 11:51:10 crc kubenswrapper[4922]: I0218 11:51:10.449756 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:10 crc kubenswrapper[4922]: I0218 11:51:10.450258 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:10 crc kubenswrapper[4922]: I0218 11:51:10.477815 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:10 crc kubenswrapper[4922]: I0218 11:51:10.776692 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-8rrxt" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.758035 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv"] Feb 18 11:51:11 crc kubenswrapper[4922]: E0218 11:51:11.758270 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987d782f-67f5-4ce7-bd98-cea59f177e8d" containerName="registry-server" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.758282 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="987d782f-67f5-4ce7-bd98-cea59f177e8d" containerName="registry-server" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.758425 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="987d782f-67f5-4ce7-bd98-cea59f177e8d" containerName="registry-server" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.759277 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.764149 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lqlmg" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.768823 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv"] Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.843876 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-util\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.843931 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zrj\" (UniqueName: \"kubernetes.io/projected/83694df8-b6fe-4913-8f73-d53972c81f36-kube-api-access-t2zrj\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: 
\"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.843967 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-bundle\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.944790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-util\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.944848 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zrj\" (UniqueName: \"kubernetes.io/projected/83694df8-b6fe-4913-8f73-d53972c81f36-kube-api-access-t2zrj\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.944882 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-bundle\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 
11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.945503 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-bundle\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.945706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-util\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:11 crc kubenswrapper[4922]: I0218 11:51:11.971746 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zrj\" (UniqueName: \"kubernetes.io/projected/83694df8-b6fe-4913-8f73-d53972c81f36-kube-api-access-t2zrj\") pod \"14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:12 crc kubenswrapper[4922]: I0218 11:51:12.081801 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:12 crc kubenswrapper[4922]: I0218 11:51:12.485728 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv"] Feb 18 11:51:12 crc kubenswrapper[4922]: W0218 11:51:12.494379 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83694df8_b6fe_4913_8f73_d53972c81f36.slice/crio-4350210d6e70cfdde3c8b3dc3c6842679a33148f8e693a5bd1aabc74d003ccc7 WatchSource:0}: Error finding container 4350210d6e70cfdde3c8b3dc3c6842679a33148f8e693a5bd1aabc74d003ccc7: Status 404 returned error can't find the container with id 4350210d6e70cfdde3c8b3dc3c6842679a33148f8e693a5bd1aabc74d003ccc7 Feb 18 11:51:12 crc kubenswrapper[4922]: I0218 11:51:12.750577 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" event={"ID":"83694df8-b6fe-4913-8f73-d53972c81f36","Type":"ContainerStarted","Data":"261ae208f78558fd60501df0dcebd0c60e4b4e40aab5ca673884561de1f961e5"} Feb 18 11:51:12 crc kubenswrapper[4922]: I0218 11:51:12.750867 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" event={"ID":"83694df8-b6fe-4913-8f73-d53972c81f36","Type":"ContainerStarted","Data":"4350210d6e70cfdde3c8b3dc3c6842679a33148f8e693a5bd1aabc74d003ccc7"} Feb 18 11:51:13 crc kubenswrapper[4922]: I0218 11:51:13.759529 4922 generic.go:334] "Generic (PLEG): container finished" podID="83694df8-b6fe-4913-8f73-d53972c81f36" containerID="261ae208f78558fd60501df0dcebd0c60e4b4e40aab5ca673884561de1f961e5" exitCode=0 Feb 18 11:51:13 crc kubenswrapper[4922]: I0218 11:51:13.759591 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" event={"ID":"83694df8-b6fe-4913-8f73-d53972c81f36","Type":"ContainerDied","Data":"261ae208f78558fd60501df0dcebd0c60e4b4e40aab5ca673884561de1f961e5"} Feb 18 11:51:14 crc kubenswrapper[4922]: I0218 11:51:14.768454 4922 generic.go:334] "Generic (PLEG): container finished" podID="83694df8-b6fe-4913-8f73-d53972c81f36" containerID="92fb3b0912ddc49b82bc590ae3ba094326610c0781c4874e5c5df7beeeb18ae4" exitCode=0 Feb 18 11:51:14 crc kubenswrapper[4922]: I0218 11:51:14.768733 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" event={"ID":"83694df8-b6fe-4913-8f73-d53972c81f36","Type":"ContainerDied","Data":"92fb3b0912ddc49b82bc590ae3ba094326610c0781c4874e5c5df7beeeb18ae4"} Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.334174 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j769b"] Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.337666 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.346797 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j769b"] Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.491300 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-catalog-content\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.491406 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67mpt\" (UniqueName: \"kubernetes.io/projected/c0a7f927-eadd-4bed-85f2-4347306f598f-kube-api-access-67mpt\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.491431 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-utilities\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.592532 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-catalog-content\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.592617 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-67mpt\" (UniqueName: \"kubernetes.io/projected/c0a7f927-eadd-4bed-85f2-4347306f598f-kube-api-access-67mpt\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.592648 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-utilities\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.593073 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-catalog-content\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.593104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-utilities\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.612470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67mpt\" (UniqueName: \"kubernetes.io/projected/c0a7f927-eadd-4bed-85f2-4347306f598f-kube-api-access-67mpt\") pod \"community-operators-j769b\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.657446 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.826137 4922 generic.go:334] "Generic (PLEG): container finished" podID="83694df8-b6fe-4913-8f73-d53972c81f36" containerID="586d9d63f8b1efc80891299a636b0ac6ecb9eacdf61b31838f42df503a796f5b" exitCode=0 Feb 18 11:51:15 crc kubenswrapper[4922]: I0218 11:51:15.827355 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" event={"ID":"83694df8-b6fe-4913-8f73-d53972c81f36","Type":"ContainerDied","Data":"586d9d63f8b1efc80891299a636b0ac6ecb9eacdf61b31838f42df503a796f5b"} Feb 18 11:51:16 crc kubenswrapper[4922]: I0218 11:51:16.152848 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j769b"] Feb 18 11:51:16 crc kubenswrapper[4922]: I0218 11:51:16.833941 4922 generic.go:334] "Generic (PLEG): container finished" podID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerID="d47389a21468630e8d39ae2bd35959f5913e72d9622874ba171d46cf996381ed" exitCode=0 Feb 18 11:51:16 crc kubenswrapper[4922]: I0218 11:51:16.834012 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j769b" event={"ID":"c0a7f927-eadd-4bed-85f2-4347306f598f","Type":"ContainerDied","Data":"d47389a21468630e8d39ae2bd35959f5913e72d9622874ba171d46cf996381ed"} Feb 18 11:51:16 crc kubenswrapper[4922]: I0218 11:51:16.834588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j769b" event={"ID":"c0a7f927-eadd-4bed-85f2-4347306f598f","Type":"ContainerStarted","Data":"5831f18aa88a23531ab09da14d84f9930cfd9b5366af084b1e7687e0f60fab18"} Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.108665 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.220271 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-util\") pod \"83694df8-b6fe-4913-8f73-d53972c81f36\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.220354 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2zrj\" (UniqueName: \"kubernetes.io/projected/83694df8-b6fe-4913-8f73-d53972c81f36-kube-api-access-t2zrj\") pod \"83694df8-b6fe-4913-8f73-d53972c81f36\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.220423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-bundle\") pod \"83694df8-b6fe-4913-8f73-d53972c81f36\" (UID: \"83694df8-b6fe-4913-8f73-d53972c81f36\") " Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.221412 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-bundle" (OuterVolumeSpecName: "bundle") pod "83694df8-b6fe-4913-8f73-d53972c81f36" (UID: "83694df8-b6fe-4913-8f73-d53972c81f36"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.227902 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83694df8-b6fe-4913-8f73-d53972c81f36-kube-api-access-t2zrj" (OuterVolumeSpecName: "kube-api-access-t2zrj") pod "83694df8-b6fe-4913-8f73-d53972c81f36" (UID: "83694df8-b6fe-4913-8f73-d53972c81f36"). InnerVolumeSpecName "kube-api-access-t2zrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.235589 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-util" (OuterVolumeSpecName: "util") pod "83694df8-b6fe-4913-8f73-d53972c81f36" (UID: "83694df8-b6fe-4913-8f73-d53972c81f36"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.321632 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-util\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.321671 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2zrj\" (UniqueName: \"kubernetes.io/projected/83694df8-b6fe-4913-8f73-d53972c81f36-kube-api-access-t2zrj\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.321682 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83694df8-b6fe-4913-8f73-d53972c81f36-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.846307 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" event={"ID":"83694df8-b6fe-4913-8f73-d53972c81f36","Type":"ContainerDied","Data":"4350210d6e70cfdde3c8b3dc3c6842679a33148f8e693a5bd1aabc74d003ccc7"} Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.846410 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4350210d6e70cfdde3c8b3dc3c6842679a33148f8e693a5bd1aabc74d003ccc7" Feb 18 11:51:17 crc kubenswrapper[4922]: I0218 11:51:17.846563 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv" Feb 18 11:51:18 crc kubenswrapper[4922]: I0218 11:51:18.860340 4922 generic.go:334] "Generic (PLEG): container finished" podID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerID="e109ea9bbc68468ccc7af684b8cf48822d1125374586c5006fdd34b142717ecb" exitCode=0 Feb 18 11:51:18 crc kubenswrapper[4922]: I0218 11:51:18.860559 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j769b" event={"ID":"c0a7f927-eadd-4bed-85f2-4347306f598f","Type":"ContainerDied","Data":"e109ea9bbc68468ccc7af684b8cf48822d1125374586c5006fdd34b142717ecb"} Feb 18 11:51:19 crc kubenswrapper[4922]: I0218 11:51:19.870670 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j769b" event={"ID":"c0a7f927-eadd-4bed-85f2-4347306f598f","Type":"ContainerStarted","Data":"7457c3c4cfe051c442809f502b83c3ceabdeb0e47c056e1b34acbb2533dd1165"} Feb 18 11:51:19 crc kubenswrapper[4922]: I0218 11:51:19.892100 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j769b" podStartSLOduration=2.460582113 podStartE2EDuration="4.892085732s" podCreationTimestamp="2026-02-18 11:51:15 +0000 UTC" firstStartedPulling="2026-02-18 11:51:16.836542523 +0000 UTC m=+878.564246603" lastFinishedPulling="2026-02-18 11:51:19.268046102 +0000 UTC m=+880.995750222" observedRunningTime="2026-02-18 11:51:19.887178888 +0000 UTC m=+881.614882968" watchObservedRunningTime="2026-02-18 11:51:19.892085732 +0000 UTC m=+881.619789812" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.672576 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v"] Feb 18 11:51:21 crc kubenswrapper[4922]: E0218 11:51:21.673259 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="pull" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.673270 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="pull" Feb 18 11:51:21 crc kubenswrapper[4922]: E0218 11:51:21.673291 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="extract" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.673298 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="extract" Feb 18 11:51:21 crc kubenswrapper[4922]: E0218 11:51:21.673308 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="util" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.673313 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="util" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.673442 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="83694df8-b6fe-4913-8f73-d53972c81f36" containerName="extract" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.673852 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.675998 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dst44" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.697540 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v"] Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.783094 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k25d\" (UniqueName: \"kubernetes.io/projected/51a617b6-1c84-446a-a342-bd0687227c0c-kube-api-access-5k25d\") pod \"openstack-operator-controller-init-f8b4c896c-mdz6v\" (UID: \"51a617b6-1c84-446a-a342-bd0687227c0c\") " pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.885564 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k25d\" (UniqueName: \"kubernetes.io/projected/51a617b6-1c84-446a-a342-bd0687227c0c-kube-api-access-5k25d\") pod \"openstack-operator-controller-init-f8b4c896c-mdz6v\" (UID: \"51a617b6-1c84-446a-a342-bd0687227c0c\") " pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.911742 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k25d\" (UniqueName: \"kubernetes.io/projected/51a617b6-1c84-446a-a342-bd0687227c0c-kube-api-access-5k25d\") pod \"openstack-operator-controller-init-f8b4c896c-mdz6v\" (UID: \"51a617b6-1c84-446a-a342-bd0687227c0c\") " pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:21 crc kubenswrapper[4922]: I0218 11:51:21.993281 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:22 crc kubenswrapper[4922]: I0218 11:51:22.430070 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v"] Feb 18 11:51:22 crc kubenswrapper[4922]: W0218 11:51:22.441238 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a617b6_1c84_446a_a342_bd0687227c0c.slice/crio-4df226afa3df4722740fefe68da25a4b751a0d880bf7841972b9a74778777a24 WatchSource:0}: Error finding container 4df226afa3df4722740fefe68da25a4b751a0d880bf7841972b9a74778777a24: Status 404 returned error can't find the container with id 4df226afa3df4722740fefe68da25a4b751a0d880bf7841972b9a74778777a24 Feb 18 11:51:22 crc kubenswrapper[4922]: I0218 11:51:22.896460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" event={"ID":"51a617b6-1c84-446a-a342-bd0687227c0c","Type":"ContainerStarted","Data":"4df226afa3df4722740fefe68da25a4b751a0d880bf7841972b9a74778777a24"} Feb 18 11:51:25 crc kubenswrapper[4922]: I0218 11:51:25.658033 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:25 crc kubenswrapper[4922]: I0218 11:51:25.658295 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:25 crc kubenswrapper[4922]: I0218 11:51:25.714377 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:25 crc kubenswrapper[4922]: I0218 11:51:25.960687 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:26 crc kubenswrapper[4922]: I0218 
11:51:26.924899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" event={"ID":"51a617b6-1c84-446a-a342-bd0687227c0c","Type":"ContainerStarted","Data":"b0906d449d442e9dff94a969db860fb792187b709eadf0bc697c597c37a9c9c2"} Feb 18 11:51:26 crc kubenswrapper[4922]: I0218 11:51:26.925349 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:26 crc kubenswrapper[4922]: I0218 11:51:26.960567 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" podStartSLOduration=1.650565128 podStartE2EDuration="5.960552076s" podCreationTimestamp="2026-02-18 11:51:21 +0000 UTC" firstStartedPulling="2026-02-18 11:51:22.447113016 +0000 UTC m=+884.174817096" lastFinishedPulling="2026-02-18 11:51:26.757099964 +0000 UTC m=+888.484804044" observedRunningTime="2026-02-18 11:51:26.956178485 +0000 UTC m=+888.683882585" watchObservedRunningTime="2026-02-18 11:51:26.960552076 +0000 UTC m=+888.688256156" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.517556 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bk7cv"] Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.518964 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.527584 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bk7cv"] Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.685384 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-utilities\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.685534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sn6t\" (UniqueName: \"kubernetes.io/projected/53648a02-b284-431d-8ad7-11d9633b0149-kube-api-access-6sn6t\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.685561 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-catalog-content\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.787077 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sn6t\" (UniqueName: \"kubernetes.io/projected/53648a02-b284-431d-8ad7-11d9633b0149-kube-api-access-6sn6t\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.787135 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-catalog-content\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.787197 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-utilities\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.787792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-utilities\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.787868 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-catalog-content\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.811598 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sn6t\" (UniqueName: \"kubernetes.io/projected/53648a02-b284-431d-8ad7-11d9633b0149-kube-api-access-6sn6t\") pod \"certified-operators-bk7cv\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:28 crc kubenswrapper[4922]: I0218 11:51:28.836525 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.094579 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bk7cv"] Feb 18 11:51:29 crc kubenswrapper[4922]: W0218 11:51:29.101244 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53648a02_b284_431d_8ad7_11d9633b0149.slice/crio-50d1a8b8ee20e9292f4d6b1978fd5e8273fa913669322f1ad46ae1142887a476 WatchSource:0}: Error finding container 50d1a8b8ee20e9292f4d6b1978fd5e8273fa913669322f1ad46ae1142887a476: Status 404 returned error can't find the container with id 50d1a8b8ee20e9292f4d6b1978fd5e8273fa913669322f1ad46ae1142887a476 Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.313010 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j769b"] Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.313284 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j769b" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="registry-server" containerID="cri-o://7457c3c4cfe051c442809f502b83c3ceabdeb0e47c056e1b34acbb2533dd1165" gracePeriod=2 Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.957175 4922 generic.go:334] "Generic (PLEG): container finished" podID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerID="7457c3c4cfe051c442809f502b83c3ceabdeb0e47c056e1b34acbb2533dd1165" exitCode=0 Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.957273 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j769b" event={"ID":"c0a7f927-eadd-4bed-85f2-4347306f598f","Type":"ContainerDied","Data":"7457c3c4cfe051c442809f502b83c3ceabdeb0e47c056e1b34acbb2533dd1165"} Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.958522 4922 generic.go:334] 
"Generic (PLEG): container finished" podID="53648a02-b284-431d-8ad7-11d9633b0149" containerID="c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4" exitCode=0 Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.958551 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerDied","Data":"c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4"} Feb 18 11:51:29 crc kubenswrapper[4922]: I0218 11:51:29.958576 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerStarted","Data":"50d1a8b8ee20e9292f4d6b1978fd5e8273fa913669322f1ad46ae1142887a476"} Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.287962 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.408581 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-utilities\") pod \"c0a7f927-eadd-4bed-85f2-4347306f598f\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.408771 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67mpt\" (UniqueName: \"kubernetes.io/projected/c0a7f927-eadd-4bed-85f2-4347306f598f-kube-api-access-67mpt\") pod \"c0a7f927-eadd-4bed-85f2-4347306f598f\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.408832 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-catalog-content\") pod 
\"c0a7f927-eadd-4bed-85f2-4347306f598f\" (UID: \"c0a7f927-eadd-4bed-85f2-4347306f598f\") " Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.409443 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-utilities" (OuterVolumeSpecName: "utilities") pod "c0a7f927-eadd-4bed-85f2-4347306f598f" (UID: "c0a7f927-eadd-4bed-85f2-4347306f598f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.415202 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a7f927-eadd-4bed-85f2-4347306f598f-kube-api-access-67mpt" (OuterVolumeSpecName: "kube-api-access-67mpt") pod "c0a7f927-eadd-4bed-85f2-4347306f598f" (UID: "c0a7f927-eadd-4bed-85f2-4347306f598f"). InnerVolumeSpecName "kube-api-access-67mpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.477003 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0a7f927-eadd-4bed-85f2-4347306f598f" (UID: "c0a7f927-eadd-4bed-85f2-4347306f598f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.511027 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67mpt\" (UniqueName: \"kubernetes.io/projected/c0a7f927-eadd-4bed-85f2-4347306f598f-kube-api-access-67mpt\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.511076 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.511085 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a7f927-eadd-4bed-85f2-4347306f598f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.970032 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j769b" event={"ID":"c0a7f927-eadd-4bed-85f2-4347306f598f","Type":"ContainerDied","Data":"5831f18aa88a23531ab09da14d84f9930cfd9b5366af084b1e7687e0f60fab18"} Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.970093 4922 scope.go:117] "RemoveContainer" containerID="7457c3c4cfe051c442809f502b83c3ceabdeb0e47c056e1b34acbb2533dd1165" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.970217 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j769b" Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.985990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerStarted","Data":"ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9"} Feb 18 11:51:30 crc kubenswrapper[4922]: I0218 11:51:30.997320 4922 scope.go:117] "RemoveContainer" containerID="e109ea9bbc68468ccc7af684b8cf48822d1125374586c5006fdd34b142717ecb" Feb 18 11:51:31 crc kubenswrapper[4922]: I0218 11:51:31.027771 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j769b"] Feb 18 11:51:31 crc kubenswrapper[4922]: I0218 11:51:31.035130 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j769b"] Feb 18 11:51:31 crc kubenswrapper[4922]: I0218 11:51:31.035171 4922 scope.go:117] "RemoveContainer" containerID="d47389a21468630e8d39ae2bd35959f5913e72d9622874ba171d46cf996381ed" Feb 18 11:51:31 crc kubenswrapper[4922]: I0218 11:51:31.990973 4922 generic.go:334] "Generic (PLEG): container finished" podID="53648a02-b284-431d-8ad7-11d9633b0149" containerID="ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9" exitCode=0 Feb 18 11:51:31 crc kubenswrapper[4922]: I0218 11:51:31.991088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerDied","Data":"ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9"} Feb 18 11:51:31 crc kubenswrapper[4922]: I0218 11:51:31.998292 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-f8b4c896c-mdz6v" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.927838 4922 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-p58lq"] Feb 18 11:51:32 crc kubenswrapper[4922]: E0218 11:51:32.928868 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="extract-utilities" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.928971 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="extract-utilities" Feb 18 11:51:32 crc kubenswrapper[4922]: E0218 11:51:32.929062 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="registry-server" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.929145 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="registry-server" Feb 18 11:51:32 crc kubenswrapper[4922]: E0218 11:51:32.929329 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="extract-content" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.929434 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="extract-content" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.929688 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" containerName="registry-server" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.930979 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.967318 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p58lq"] Feb 18 11:51:32 crc kubenswrapper[4922]: I0218 11:51:32.986604 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a7f927-eadd-4bed-85f2-4347306f598f" path="/var/lib/kubelet/pods/c0a7f927-eadd-4bed-85f2-4347306f598f/volumes" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.052651 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-catalog-content\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.052735 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbf5\" (UniqueName: \"kubernetes.io/projected/db4115a8-0d17-481f-8dee-87d0cb403c71-kube-api-access-6cbf5\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.052836 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-utilities\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.154659 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-catalog-content\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.154792 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cbf5\" (UniqueName: \"kubernetes.io/projected/db4115a8-0d17-481f-8dee-87d0cb403c71-kube-api-access-6cbf5\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.155281 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-catalog-content\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.155813 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-utilities\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.156297 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-utilities\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.185728 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbf5\" (UniqueName: 
\"kubernetes.io/projected/db4115a8-0d17-481f-8dee-87d0cb403c71-kube-api-access-6cbf5\") pod \"redhat-marketplace-p58lq\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.266328 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:33 crc kubenswrapper[4922]: I0218 11:51:33.526435 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p58lq"] Feb 18 11:51:34 crc kubenswrapper[4922]: I0218 11:51:34.007736 4922 generic.go:334] "Generic (PLEG): container finished" podID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerID="d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a" exitCode=0 Feb 18 11:51:34 crc kubenswrapper[4922]: I0218 11:51:34.007778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p58lq" event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerDied","Data":"d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a"} Feb 18 11:51:34 crc kubenswrapper[4922]: I0218 11:51:34.007804 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p58lq" event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerStarted","Data":"da79be7f7e5f68550d58344941e1e73d8c70455fdf2e899d8d747aef7496924b"} Feb 18 11:51:35 crc kubenswrapper[4922]: I0218 11:51:35.017884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerStarted","Data":"878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9"} Feb 18 11:51:35 crc kubenswrapper[4922]: I0218 11:51:35.022059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p58lq" 
event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerStarted","Data":"862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c"} Feb 18 11:51:35 crc kubenswrapper[4922]: I0218 11:51:35.053260 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bk7cv" podStartSLOduration=3.095865188 podStartE2EDuration="7.053245112s" podCreationTimestamp="2026-02-18 11:51:28 +0000 UTC" firstStartedPulling="2026-02-18 11:51:29.960084341 +0000 UTC m=+891.687788421" lastFinishedPulling="2026-02-18 11:51:33.917464265 +0000 UTC m=+895.645168345" observedRunningTime="2026-02-18 11:51:35.048341208 +0000 UTC m=+896.776045298" watchObservedRunningTime="2026-02-18 11:51:35.053245112 +0000 UTC m=+896.780949192" Feb 18 11:51:36 crc kubenswrapper[4922]: I0218 11:51:36.034149 4922 generic.go:334] "Generic (PLEG): container finished" podID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerID="862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c" exitCode=0 Feb 18 11:51:36 crc kubenswrapper[4922]: I0218 11:51:36.034270 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p58lq" event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerDied","Data":"862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c"} Feb 18 11:51:38 crc kubenswrapper[4922]: I0218 11:51:38.050127 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p58lq" event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerStarted","Data":"6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a"} Feb 18 11:51:38 crc kubenswrapper[4922]: I0218 11:51:38.073277 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p58lq" podStartSLOduration=3.309403044 podStartE2EDuration="6.073255944s" podCreationTimestamp="2026-02-18 11:51:32 +0000 UTC" 
firstStartedPulling="2026-02-18 11:51:34.009537438 +0000 UTC m=+895.737241518" lastFinishedPulling="2026-02-18 11:51:36.773390328 +0000 UTC m=+898.501094418" observedRunningTime="2026-02-18 11:51:38.07032234 +0000 UTC m=+899.798026420" watchObservedRunningTime="2026-02-18 11:51:38.073255944 +0000 UTC m=+899.800960024" Feb 18 11:51:38 crc kubenswrapper[4922]: I0218 11:51:38.837352 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:38 crc kubenswrapper[4922]: I0218 11:51:38.837440 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:38 crc kubenswrapper[4922]: I0218 11:51:38.926033 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:39 crc kubenswrapper[4922]: I0218 11:51:39.100641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:39 crc kubenswrapper[4922]: I0218 11:51:39.911541 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bk7cv"] Feb 18 11:51:41 crc kubenswrapper[4922]: I0218 11:51:41.069990 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bk7cv" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="registry-server" containerID="cri-o://878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9" gracePeriod=2 Feb 18 11:51:41 crc kubenswrapper[4922]: I0218 11:51:41.954955 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.077756 4922 generic.go:334] "Generic (PLEG): container finished" podID="53648a02-b284-431d-8ad7-11d9633b0149" containerID="878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9" exitCode=0 Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.077807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerDied","Data":"878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9"} Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.077834 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bk7cv" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.077853 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk7cv" event={"ID":"53648a02-b284-431d-8ad7-11d9633b0149","Type":"ContainerDied","Data":"50d1a8b8ee20e9292f4d6b1978fd5e8273fa913669322f1ad46ae1142887a476"} Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.077871 4922 scope.go:117] "RemoveContainer" containerID="878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.093281 4922 scope.go:117] "RemoveContainer" containerID="ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.096037 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-utilities\") pod \"53648a02-b284-431d-8ad7-11d9633b0149\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.096088 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6sn6t\" (UniqueName: \"kubernetes.io/projected/53648a02-b284-431d-8ad7-11d9633b0149-kube-api-access-6sn6t\") pod \"53648a02-b284-431d-8ad7-11d9633b0149\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.096178 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-catalog-content\") pod \"53648a02-b284-431d-8ad7-11d9633b0149\" (UID: \"53648a02-b284-431d-8ad7-11d9633b0149\") " Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.096966 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-utilities" (OuterVolumeSpecName: "utilities") pod "53648a02-b284-431d-8ad7-11d9633b0149" (UID: "53648a02-b284-431d-8ad7-11d9633b0149"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.104900 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53648a02-b284-431d-8ad7-11d9633b0149-kube-api-access-6sn6t" (OuterVolumeSpecName: "kube-api-access-6sn6t") pod "53648a02-b284-431d-8ad7-11d9633b0149" (UID: "53648a02-b284-431d-8ad7-11d9633b0149"). InnerVolumeSpecName "kube-api-access-6sn6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.124899 4922 scope.go:117] "RemoveContainer" containerID="c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.158741 4922 scope.go:117] "RemoveContainer" containerID="878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9" Feb 18 11:51:42 crc kubenswrapper[4922]: E0218 11:51:42.159294 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9\": container with ID starting with 878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9 not found: ID does not exist" containerID="878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.159338 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9"} err="failed to get container status \"878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9\": rpc error: code = NotFound desc = could not find container \"878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9\": container with ID starting with 878c0580bbc8862d88a8e46bf5f5871de16dcdeb37ec26cd61e5cfea1dbd4ed9 not found: ID does not exist" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.159392 4922 scope.go:117] "RemoveContainer" containerID="ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9" Feb 18 11:51:42 crc kubenswrapper[4922]: E0218 11:51:42.159929 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9\": container with ID starting with 
ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9 not found: ID does not exist" containerID="ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.159959 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9"} err="failed to get container status \"ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9\": rpc error: code = NotFound desc = could not find container \"ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9\": container with ID starting with ec4ff970101ce6b6fb9d0d73cbeb58c38bed6a2f72f5b7ed86b2120b52d04ca9 not found: ID does not exist" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.159976 4922 scope.go:117] "RemoveContainer" containerID="c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4" Feb 18 11:51:42 crc kubenswrapper[4922]: E0218 11:51:42.160428 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4\": container with ID starting with c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4 not found: ID does not exist" containerID="c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.160489 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4"} err="failed to get container status \"c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4\": rpc error: code = NotFound desc = could not find container \"c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4\": container with ID starting with c290c7904b39cf38f89a34e85673fbbb9bef3d95a11b5f7e3b6bf2302fec2ca4 not found: ID does not 
exist" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.163752 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53648a02-b284-431d-8ad7-11d9633b0149" (UID: "53648a02-b284-431d-8ad7-11d9633b0149"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.198176 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.198213 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sn6t\" (UniqueName: \"kubernetes.io/projected/53648a02-b284-431d-8ad7-11d9633b0149-kube-api-access-6sn6t\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.198224 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53648a02-b284-431d-8ad7-11d9633b0149-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.405745 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bk7cv"] Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.410687 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bk7cv"] Feb 18 11:51:42 crc kubenswrapper[4922]: I0218 11:51:42.981799 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53648a02-b284-431d-8ad7-11d9633b0149" path="/var/lib/kubelet/pods/53648a02-b284-431d-8ad7-11d9633b0149/volumes" Feb 18 11:51:43 crc kubenswrapper[4922]: I0218 11:51:43.267251 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:43 crc kubenswrapper[4922]: I0218 11:51:43.267302 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:43 crc kubenswrapper[4922]: I0218 11:51:43.318303 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:44 crc kubenswrapper[4922]: I0218 11:51:44.148694 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:45 crc kubenswrapper[4922]: I0218 11:51:45.317520 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p58lq"] Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.105609 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p58lq" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="registry-server" containerID="cri-o://6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a" gracePeriod=2 Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.546764 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.657057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-catalog-content\") pod \"db4115a8-0d17-481f-8dee-87d0cb403c71\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.657117 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cbf5\" (UniqueName: \"kubernetes.io/projected/db4115a8-0d17-481f-8dee-87d0cb403c71-kube-api-access-6cbf5\") pod \"db4115a8-0d17-481f-8dee-87d0cb403c71\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.657169 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-utilities\") pod \"db4115a8-0d17-481f-8dee-87d0cb403c71\" (UID: \"db4115a8-0d17-481f-8dee-87d0cb403c71\") " Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.658189 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-utilities" (OuterVolumeSpecName: "utilities") pod "db4115a8-0d17-481f-8dee-87d0cb403c71" (UID: "db4115a8-0d17-481f-8dee-87d0cb403c71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.664597 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4115a8-0d17-481f-8dee-87d0cb403c71-kube-api-access-6cbf5" (OuterVolumeSpecName: "kube-api-access-6cbf5") pod "db4115a8-0d17-481f-8dee-87d0cb403c71" (UID: "db4115a8-0d17-481f-8dee-87d0cb403c71"). InnerVolumeSpecName "kube-api-access-6cbf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.698977 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db4115a8-0d17-481f-8dee-87d0cb403c71" (UID: "db4115a8-0d17-481f-8dee-87d0cb403c71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.758845 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.759199 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cbf5\" (UniqueName: \"kubernetes.io/projected/db4115a8-0d17-481f-8dee-87d0cb403c71-kube-api-access-6cbf5\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:46 crc kubenswrapper[4922]: I0218 11:51:46.759283 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db4115a8-0d17-481f-8dee-87d0cb403c71-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.114324 4922 generic.go:334] "Generic (PLEG): container finished" podID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerID="6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a" exitCode=0 Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.114386 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p58lq" event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerDied","Data":"6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a"} Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.114413 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-p58lq" event={"ID":"db4115a8-0d17-481f-8dee-87d0cb403c71","Type":"ContainerDied","Data":"da79be7f7e5f68550d58344941e1e73d8c70455fdf2e899d8d747aef7496924b"} Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.114429 4922 scope.go:117] "RemoveContainer" containerID="6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.114775 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p58lq" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.134754 4922 scope.go:117] "RemoveContainer" containerID="862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.142686 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p58lq"] Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.150190 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p58lq"] Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.165701 4922 scope.go:117] "RemoveContainer" containerID="d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.184700 4922 scope.go:117] "RemoveContainer" containerID="6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a" Feb 18 11:51:47 crc kubenswrapper[4922]: E0218 11:51:47.185238 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a\": container with ID starting with 6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a not found: ID does not exist" containerID="6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.185290 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a"} err="failed to get container status \"6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a\": rpc error: code = NotFound desc = could not find container \"6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a\": container with ID starting with 6062a677bdd3c0fe63d63f3e2d2ffd7e256189a1991d655f3cbb250dc14f3a2a not found: ID does not exist" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.185325 4922 scope.go:117] "RemoveContainer" containerID="862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c" Feb 18 11:51:47 crc kubenswrapper[4922]: E0218 11:51:47.185713 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c\": container with ID starting with 862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c not found: ID does not exist" containerID="862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.185751 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c"} err="failed to get container status \"862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c\": rpc error: code = NotFound desc = could not find container \"862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c\": container with ID starting with 862386f6582f32e25d28a0f4f7fa1fed459b4fd5dd51c2ac625bc35c25f9ad0c not found: ID does not exist" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.185780 4922 scope.go:117] "RemoveContainer" containerID="d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a" Feb 18 11:51:47 crc kubenswrapper[4922]: E0218 
11:51:47.186037 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a\": container with ID starting with d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a not found: ID does not exist" containerID="d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a" Feb 18 11:51:47 crc kubenswrapper[4922]: I0218 11:51:47.186073 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a"} err="failed to get container status \"d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a\": rpc error: code = NotFound desc = could not find container \"d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a\": container with ID starting with d3d937b23ea0593597efdc3795dc6f7ac95dc8a38af4b1dd12165bb613b69c2a not found: ID does not exist" Feb 18 11:51:48 crc kubenswrapper[4922]: I0218 11:51:48.981126 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" path="/var/lib/kubelet/pods/db4115a8-0d17-481f-8dee-87d0cb403c71/volumes" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.764663 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk"] Feb 18 11:51:51 crc kubenswrapper[4922]: E0218 11:51:51.765592 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="registry-server" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765607 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="registry-server" Feb 18 11:51:51 crc kubenswrapper[4922]: E0218 11:51:51.765633 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="extract-content" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765641 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="extract-content" Feb 18 11:51:51 crc kubenswrapper[4922]: E0218 11:51:51.765653 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="extract-utilities" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765661 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="extract-utilities" Feb 18 11:51:51 crc kubenswrapper[4922]: E0218 11:51:51.765672 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="extract-content" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765679 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="extract-content" Feb 18 11:51:51 crc kubenswrapper[4922]: E0218 11:51:51.765689 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="extract-utilities" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765696 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="extract-utilities" Feb 18 11:51:51 crc kubenswrapper[4922]: E0218 11:51:51.765705 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="registry-server" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765711 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="registry-server" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765851 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="db4115a8-0d17-481f-8dee-87d0cb403c71" containerName="registry-server" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.765875 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="53648a02-b284-431d-8ad7-11d9633b0149" containerName="registry-server" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.766387 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.770118 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zqd4d" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.780949 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.781934 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.785687 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nhx4z" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.805666 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.807244 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.825467 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ktdtd" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.851301 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.891012 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.908235 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.914538 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.915419 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.923076 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.923142 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-wscdl" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.924112 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.932875 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-94cg6" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.940104 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fnf\" (UniqueName: \"kubernetes.io/projected/01766bee-50bd-4dcb-9b3d-831486ddeaf4-kube-api-access-z5fnf\") pod \"designate-operator-controller-manager-6d8bf5c495-2ncv8\" (UID: \"01766bee-50bd-4dcb-9b3d-831486ddeaf4\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.940183 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6pt7\" (UniqueName: \"kubernetes.io/projected/ae81863a-2778-4505-9106-c850f873a75d-kube-api-access-f6pt7\") pod \"barbican-operator-controller-manager-868647ff47-f8lbk\" (UID: \"ae81863a-2778-4505-9106-c850f873a75d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.940220 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpq2\" (UniqueName: \"kubernetes.io/projected/61f73f1d-e472-411e-adc0-6755c47aa72b-kube-api-access-zzpq2\") pod \"cinder-operator-controller-manager-5d946d989d-6z2cq\" (UID: \"61f73f1d-e472-411e-adc0-6755c47aa72b\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.940288 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 
11:51:51.960192 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.971090 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"] Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.972152 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.979771 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-p5tfh" Feb 18 11:51:51 crc kubenswrapper[4922]: I0218 11:51:51.986777 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.000993 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-krt25"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.002212 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.005188 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9prpc" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.005446 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.042081 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.043136 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.045336 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8w9l\" (UniqueName: \"kubernetes.io/projected/51cd14ee-9b8a-421f-80bb-d208b752079d-kube-api-access-z8w9l\") pod \"heat-operator-controller-manager-69f49c598c-qm24h\" (UID: \"51cd14ee-9b8a-421f-80bb-d208b752079d\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.045418 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6pt7\" (UniqueName: \"kubernetes.io/projected/ae81863a-2778-4505-9106-c850f873a75d-kube-api-access-f6pt7\") pod \"barbican-operator-controller-manager-868647ff47-f8lbk\" (UID: \"ae81863a-2778-4505-9106-c850f873a75d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.045467 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zzpq2\" (UniqueName: \"kubernetes.io/projected/61f73f1d-e472-411e-adc0-6755c47aa72b-kube-api-access-zzpq2\") pod \"cinder-operator-controller-manager-5d946d989d-6z2cq\" (UID: \"61f73f1d-e472-411e-adc0-6755c47aa72b\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.045523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8k6q\" (UniqueName: \"kubernetes.io/projected/4c9af0bf-50d7-42ef-a8df-241b5ec63f5a-kube-api-access-f8k6q\") pod \"glance-operator-controller-manager-77987464f4-bnvrn\" (UID: \"4c9af0bf-50d7-42ef-a8df-241b5ec63f5a\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.045558 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fnf\" (UniqueName: \"kubernetes.io/projected/01766bee-50bd-4dcb-9b3d-831486ddeaf4-kube-api-access-z5fnf\") pod \"designate-operator-controller-manager-6d8bf5c495-2ncv8\" (UID: \"01766bee-50bd-4dcb-9b3d-831486ddeaf4\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.050460 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5h586" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.055461 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.056270 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.072315 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-krt25"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.073208 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fnf\" (UniqueName: \"kubernetes.io/projected/01766bee-50bd-4dcb-9b3d-831486ddeaf4-kube-api-access-z5fnf\") pod \"designate-operator-controller-manager-6d8bf5c495-2ncv8\" (UID: \"01766bee-50bd-4dcb-9b3d-831486ddeaf4\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.073616 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-r7jsl" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.079583 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.081994 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6pt7\" (UniqueName: \"kubernetes.io/projected/ae81863a-2778-4505-9106-c850f873a75d-kube-api-access-f6pt7\") pod \"barbican-operator-controller-manager-868647ff47-f8lbk\" (UID: \"ae81863a-2778-4505-9106-c850f873a75d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.088413 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.089211 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.089749 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.093628 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.094730 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.102029 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-bmlmc" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.102260 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b5qxz" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.102285 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.104485 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpq2\" (UniqueName: \"kubernetes.io/projected/61f73f1d-e472-411e-adc0-6755c47aa72b-kube-api-access-zzpq2\") pod \"cinder-operator-controller-manager-5d946d989d-6z2cq\" (UID: \"61f73f1d-e472-411e-adc0-6755c47aa72b\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.126789 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"] Feb 18 
11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.139887 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147639 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147746 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8k6q\" (UniqueName: \"kubernetes.io/projected/4c9af0bf-50d7-42ef-a8df-241b5ec63f5a-kube-api-access-f8k6q\") pod \"glance-operator-controller-manager-77987464f4-bnvrn\" (UID: \"4c9af0bf-50d7-42ef-a8df-241b5ec63f5a\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147809 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55cl4\" (UniqueName: \"kubernetes.io/projected/3c16d873-1097-4f56-913f-cc366ed34c23-kube-api-access-55cl4\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147849 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlvn5\" (UniqueName: \"kubernetes.io/projected/0032092e-84ca-426d-8f15-5141f4a8da20-kube-api-access-rlvn5\") pod \"horizon-operator-controller-manager-5b9b8895d5-82hvr\" (UID: \"0032092e-84ca-426d-8f15-5141f4a8da20\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147910 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvwl\" (UniqueName: \"kubernetes.io/projected/324031ff-ceae-4065-9955-fd5745647cb0-kube-api-access-kfvwl\") pod \"keystone-operator-controller-manager-b4d948c87-jtfzr\" (UID: \"324031ff-ceae-4065-9955-fd5745647cb0\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147937 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjgd\" (UniqueName: \"kubernetes.io/projected/7753280d-fc59-4887-9d87-a2cfd83e7ba9-kube-api-access-xpjgd\") pod \"ironic-operator-controller-manager-554564d7fc-r4v59\" (UID: \"7753280d-fc59-4887-9d87-a2cfd83e7ba9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.147971 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8w9l\" (UniqueName: \"kubernetes.io/projected/51cd14ee-9b8a-421f-80bb-d208b752079d-kube-api-access-z8w9l\") pod \"heat-operator-controller-manager-69f49c598c-qm24h\" (UID: \"51cd14ee-9b8a-421f-80bb-d208b752079d\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.157097 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.175437 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.192989 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.194119 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.202868 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-z5wq8"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.205219 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8w9l\" (UniqueName: \"kubernetes.io/projected/51cd14ee-9b8a-421f-80bb-d208b752079d-kube-api-access-z8w9l\") pod \"heat-operator-controller-manager-69f49c598c-qm24h\" (UID: \"51cd14ee-9b8a-421f-80bb-d208b752079d\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.205285 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.205563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8k6q\" (UniqueName: \"kubernetes.io/projected/4c9af0bf-50d7-42ef-a8df-241b5ec63f5a-kube-api-access-f8k6q\") pod \"glance-operator-controller-manager-77987464f4-bnvrn\" (UID: \"4c9af0bf-50d7-42ef-a8df-241b5ec63f5a\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.206086 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.213154 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xwfl6"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.236974 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.241752 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.243611 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.246637 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tbnxp"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.249235 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255089 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55cl4\" (UniqueName: \"kubernetes.io/projected/3c16d873-1097-4f56-913f-cc366ed34c23-kube-api-access-55cl4\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlvn5\" (UniqueName: \"kubernetes.io/projected/0032092e-84ca-426d-8f15-5141f4a8da20-kube-api-access-rlvn5\") pod \"horizon-operator-controller-manager-5b9b8895d5-82hvr\" (UID: \"0032092e-84ca-426d-8f15-5141f4a8da20\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255194 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghd7h\" (UniqueName: \"kubernetes.io/projected/2936db6d-8a5b-4da8-9e52-e508a6e757fe-kube-api-access-ghd7h\") pod \"manila-operator-controller-manager-54f6768c69-c597h\" (UID: \"2936db6d-8a5b-4da8-9e52-e508a6e757fe\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255228 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvwl\" (UniqueName: \"kubernetes.io/projected/324031ff-ceae-4065-9955-fd5745647cb0-kube-api-access-kfvwl\") pod \"keystone-operator-controller-manager-b4d948c87-jtfzr\" (UID: \"324031ff-ceae-4065-9955-fd5745647cb0\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255262 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjgd\" (UniqueName: \"kubernetes.io/projected/7753280d-fc59-4887-9d87-a2cfd83e7ba9-kube-api-access-xpjgd\") pod \"ironic-operator-controller-manager-554564d7fc-r4v59\" (UID: \"7753280d-fc59-4887-9d87-a2cfd83e7ba9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255310 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfkh\" (UniqueName: \"kubernetes.io/projected/0a8811b6-4023-427d-a893-628e0dd338e8-kube-api-access-bzfkh\") pod \"mariadb-operator-controller-manager-6994f66f48-tn47v\" (UID: \"0a8811b6-4023-427d-a893-628e0dd338e8\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.255415 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25"
Feb 18 11:51:52 crc kubenswrapper[4922]: E0218 11:51:52.255570 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 11:51:52 crc kubenswrapper[4922]: E0218 11:51:52.255633 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert podName:3c16d873-1097-4f56-913f-cc366ed34c23 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:52.755610735 +0000 UTC m=+914.483314815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert") pod "infra-operator-controller-manager-79d975b745-krt25" (UID: "3c16d873-1097-4f56-913f-cc366ed34c23") : secret "infra-operator-webhook-server-cert" not found
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.257923 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.267544 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.281380 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.283633 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvwl\" (UniqueName: \"kubernetes.io/projected/324031ff-ceae-4065-9955-fd5745647cb0-kube-api-access-kfvwl\") pod \"keystone-operator-controller-manager-b4d948c87-jtfzr\" (UID: \"324031ff-ceae-4065-9955-fd5745647cb0\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.286218 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjgd\" (UniqueName: \"kubernetes.io/projected/7753280d-fc59-4887-9d87-a2cfd83e7ba9-kube-api-access-xpjgd\") pod \"ironic-operator-controller-manager-554564d7fc-r4v59\" (UID: \"7753280d-fc59-4887-9d87-a2cfd83e7ba9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.290401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlvn5\" (UniqueName: \"kubernetes.io/projected/0032092e-84ca-426d-8f15-5141f4a8da20-kube-api-access-rlvn5\") pod \"horizon-operator-controller-manager-5b9b8895d5-82hvr\" (UID: \"0032092e-84ca-426d-8f15-5141f4a8da20\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.301723 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.302928 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.303497 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.304812 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.305055 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dbfjs"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.305621 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.306765 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55cl4\" (UniqueName: \"kubernetes.io/projected/3c16d873-1097-4f56-913f-cc366ed34c23-kube-api-access-55cl4\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.307937 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bttjx"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.377194 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghd7h\" (UniqueName: \"kubernetes.io/projected/2936db6d-8a5b-4da8-9e52-e508a6e757fe-kube-api-access-ghd7h\") pod \"manila-operator-controller-manager-54f6768c69-c597h\" (UID: \"2936db6d-8a5b-4da8-9e52-e508a6e757fe\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.377338 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrbmr\" (UniqueName: \"kubernetes.io/projected/a7487625-0c9e-4396-8eb8-5840ce4344c8-kube-api-access-lrbmr\") pod \"neutron-operator-controller-manager-64ddbf8bb-gwbk7\" (UID: \"a7487625-0c9e-4396-8eb8-5840ce4344c8\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.379528 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfkh\" (UniqueName: \"kubernetes.io/projected/0a8811b6-4023-427d-a893-628e0dd338e8-kube-api-access-bzfkh\") pod \"mariadb-operator-controller-manager-6994f66f48-tn47v\" (UID: \"0a8811b6-4023-427d-a893-628e0dd338e8\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.380535 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngz5n\" (UniqueName: \"kubernetes.io/projected/8eae5053-64f3-401a-a151-dbf22f30a845-kube-api-access-ngz5n\") pod \"octavia-operator-controller-manager-69f8888797-4fm4m\" (UID: \"8eae5053-64f3-401a-a151-dbf22f30a845\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.380592 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l57qx\" (UniqueName: \"kubernetes.io/projected/90b4a58a-81d7-4129-8f45-5429e963676e-kube-api-access-l57qx\") pod \"nova-operator-controller-manager-567668f5cf-wrd8w\" (UID: \"90b4a58a-81d7-4129-8f45-5429e963676e\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.391945 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.394707 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.409732 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.414185 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xgr9v"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.421339 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfkh\" (UniqueName: \"kubernetes.io/projected/0a8811b6-4023-427d-a893-628e0dd338e8-kube-api-access-bzfkh\") pod \"mariadb-operator-controller-manager-6994f66f48-tn47v\" (UID: \"0a8811b6-4023-427d-a893-628e0dd338e8\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.446203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghd7h\" (UniqueName: \"kubernetes.io/projected/2936db6d-8a5b-4da8-9e52-e508a6e757fe-kube-api-access-ghd7h\") pod \"manila-operator-controller-manager-54f6768c69-c597h\" (UID: \"2936db6d-8a5b-4da8-9e52-e508a6e757fe\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.448262 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.459595 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.460141 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.461297 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.467857 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-twl8v"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.472318 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.481407 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.486176 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczzm\" (UniqueName: \"kubernetes.io/projected/66351682-3cdf-41cc-80d9-0bbb020144d2-kube-api-access-hczzm\") pod \"placement-operator-controller-manager-8497b45c89-hddmr\" (UID: \"66351682-3cdf-41cc-80d9-0bbb020144d2\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.486271 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngz5n\" (UniqueName: \"kubernetes.io/projected/8eae5053-64f3-401a-a151-dbf22f30a845-kube-api-access-ngz5n\") pod \"octavia-operator-controller-manager-69f8888797-4fm4m\" (UID: \"8eae5053-64f3-401a-a151-dbf22f30a845\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.486303 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l57qx\" (UniqueName: \"kubernetes.io/projected/90b4a58a-81d7-4129-8f45-5429e963676e-kube-api-access-l57qx\") pod \"nova-operator-controller-manager-567668f5cf-wrd8w\" (UID: \"90b4a58a-81d7-4129-8f45-5429e963676e\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.486517 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrbmr\" (UniqueName: \"kubernetes.io/projected/a7487625-0c9e-4396-8eb8-5840ce4344c8-kube-api-access-lrbmr\") pod \"neutron-operator-controller-manager-64ddbf8bb-gwbk7\" (UID: \"a7487625-0c9e-4396-8eb8-5840ce4344c8\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.486551 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqv5\" (UniqueName: \"kubernetes.io/projected/42271b89-6aba-4e15-a2a1-856b656a1b6e-kube-api-access-qmqv5\") pod \"ovn-operator-controller-manager-d44cf6b75-z7pdl\" (UID: \"42271b89-6aba-4e15-a2a1-856b656a1b6e\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.508701 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.514465 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.515588 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.517158 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l57qx\" (UniqueName: \"kubernetes.io/projected/90b4a58a-81d7-4129-8f45-5429e963676e-kube-api-access-l57qx\") pod \"nova-operator-controller-manager-567668f5cf-wrd8w\" (UID: \"90b4a58a-81d7-4129-8f45-5429e963676e\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.521292 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngz5n\" (UniqueName: \"kubernetes.io/projected/8eae5053-64f3-401a-a151-dbf22f30a845-kube-api-access-ngz5n\") pod \"octavia-operator-controller-manager-69f8888797-4fm4m\" (UID: \"8eae5053-64f3-401a-a151-dbf22f30a845\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.522084 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9jtzj"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.527251 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.530660 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbmr\" (UniqueName: \"kubernetes.io/projected/a7487625-0c9e-4396-8eb8-5840ce4344c8-kube-api-access-lrbmr\") pod \"neutron-operator-controller-manager-64ddbf8bb-gwbk7\" (UID: \"a7487625-0c9e-4396-8eb8-5840ce4344c8\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.553353 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.572496 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.583956 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.588583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqv5\" (UniqueName: \"kubernetes.io/projected/42271b89-6aba-4e15-a2a1-856b656a1b6e-kube-api-access-qmqv5\") pod \"ovn-operator-controller-manager-d44cf6b75-z7pdl\" (UID: \"42271b89-6aba-4e15-a2a1-856b656a1b6e\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.588633 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczzm\" (UniqueName: \"kubernetes.io/projected/66351682-3cdf-41cc-80d9-0bbb020144d2-kube-api-access-hczzm\") pod \"placement-operator-controller-manager-8497b45c89-hddmr\" (UID: \"66351682-3cdf-41cc-80d9-0bbb020144d2\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.588663 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6xp\" (UniqueName: \"kubernetes.io/projected/081d9ec7-e338-437a-b3bc-af9b788db66a-kube-api-access-tc6xp\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.588717 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.588774 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn6pl\" (UniqueName: \"kubernetes.io/projected/387afbf1-afa5-414c-a22a-83a6a8197ff7-kube-api-access-zn6pl\") pod \"telemetry-operator-controller-manager-7f45b4ff68-btlqf\" (UID: \"387afbf1-afa5-414c-a22a-83a6a8197ff7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.588798 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntkx\" (UniqueName: \"kubernetes.io/projected/183b09db-ca5a-4aa1-b87b-908de4dc44ff-kube-api-access-nntkx\") pod \"swift-operator-controller-manager-68f46476f-2bk9r\" (UID: \"183b09db-ca5a-4aa1-b87b-908de4dc44ff\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.597489 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.609673 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmqv5\" (UniqueName: \"kubernetes.io/projected/42271b89-6aba-4e15-a2a1-856b656a1b6e-kube-api-access-qmqv5\") pod \"ovn-operator-controller-manager-d44cf6b75-z7pdl\" (UID: \"42271b89-6aba-4e15-a2a1-856b656a1b6e\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.609766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczzm\" (UniqueName: \"kubernetes.io/projected/66351682-3cdf-41cc-80d9-0bbb020144d2-kube-api-access-hczzm\") pod \"placement-operator-controller-manager-8497b45c89-hddmr\" (UID: \"66351682-3cdf-41cc-80d9-0bbb020144d2\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.620585 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-xdrrr"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.621939 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.624244 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xhz5c"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.627640 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-xdrrr"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.650979 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.652397 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.658801 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.661045 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.661055 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qhg5z"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.689418 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.689474 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn6pl\" (UniqueName: \"kubernetes.io/projected/387afbf1-afa5-414c-a22a-83a6a8197ff7-kube-api-access-zn6pl\") pod \"telemetry-operator-controller-manager-7f45b4ff68-btlqf\" (UID: \"387afbf1-afa5-414c-a22a-83a6a8197ff7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.689499 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntkx\" (UniqueName: \"kubernetes.io/projected/183b09db-ca5a-4aa1-b87b-908de4dc44ff-kube-api-access-nntkx\") pod \"swift-operator-controller-manager-68f46476f-2bk9r\" (UID: \"183b09db-ca5a-4aa1-b87b-908de4dc44ff\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.689543 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc7k5\" (UniqueName: \"kubernetes.io/projected/4c487619-568f-44a0-9d23-037794ada114-kube-api-access-nc7k5\") pod \"watcher-operator-controller-manager-5689f5d7c4-95x8t\" (UID: \"4c487619-568f-44a0-9d23-037794ada114\") " pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.689578 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6xp\" (UniqueName: \"kubernetes.io/projected/081d9ec7-e338-437a-b3bc-af9b788db66a-kube-api-access-tc6xp\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.689623 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gzz8\" (UniqueName: \"kubernetes.io/projected/52123256-1372-49b6-80ed-c3112d14a8fa-kube-api-access-4gzz8\") pod \"test-operator-controller-manager-7866795846-xdrrr\" (UID: \"52123256-1372-49b6-80ed-c3112d14a8fa\") " pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr"
Feb 18 11:51:52 crc kubenswrapper[4922]: E0218 11:51:52.689736 4922 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 11:51:52 crc kubenswrapper[4922]: E0218 11:51:52.689776 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert podName:081d9ec7-e338-437a-b3bc-af9b788db66a nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.189762666 +0000 UTC m=+914.917466746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" (UID: "081d9ec7-e338-437a-b3bc-af9b788db66a") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.695343 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.695964 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.712793 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn6pl\" (UniqueName: \"kubernetes.io/projected/387afbf1-afa5-414c-a22a-83a6a8197ff7-kube-api-access-zn6pl\") pod \"telemetry-operator-controller-manager-7f45b4ff68-btlqf\" (UID: \"387afbf1-afa5-414c-a22a-83a6a8197ff7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.713277 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6xp\" (UniqueName: \"kubernetes.io/projected/081d9ec7-e338-437a-b3bc-af9b788db66a-kube-api-access-tc6xp\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.714217 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntkx\" (UniqueName: \"kubernetes.io/projected/183b09db-ca5a-4aa1-b87b-908de4dc44ff-kube-api-access-nntkx\") pod \"swift-operator-controller-manager-68f46476f-2bk9r\" (UID: \"183b09db-ca5a-4aa1-b87b-908de4dc44ff\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.717759 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.741923 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.743658 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.747701 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.747752 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9rlc5"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.747972 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.763269 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.788300 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv"]
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.790824 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.790859 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7k5\" (UniqueName: \"kubernetes.io/projected/4c487619-568f-44a0-9d23-037794ada114-kube-api-access-nc7k5\") pod \"watcher-operator-controller-manager-5689f5d7c4-95x8t\" (UID: \"4c487619-568f-44a0-9d23-037794ada114\") " pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.790936 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25"
Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.790970 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gzz8\" (UniqueName: \"kubernetes.io/projected/52123256-1372-49b6-80ed-c3112d14a8fa-kube-api-access-4gzz8\") pod \"test-operator-controller-manager-7866795846-xdrrr\" (UID: \"52123256-1372-49b6-80ed-c3112d14a8fa\") " pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr"
Feb 18 11:51:52 crc kubenswrapper[4922]: E0218 11:51:52.791684 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 11:51:52 crc kubenswrapper[4922]: E0218 11:51:52.791909 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert podName:3c16d873-1097-4f56-913f-cc366ed34c23 nodeName:}" failed.
No retries permitted until 2026-02-18 11:51:53.791893122 +0000 UTC m=+915.519597202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert") pod "infra-operator-controller-manager-79d975b745-krt25" (UID: "3c16d873-1097-4f56-913f-cc366ed34c23") : secret "infra-operator-webhook-server-cert" not found Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.794564 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-b29wn" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.795238 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.801606 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.816746 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc7k5\" (UniqueName: \"kubernetes.io/projected/4c487619-568f-44a0-9d23-037794ada114-kube-api-access-nc7k5\") pod \"watcher-operator-controller-manager-5689f5d7c4-95x8t\" (UID: \"4c487619-568f-44a0-9d23-037794ada114\") " pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.839427 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.843007 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gzz8\" (UniqueName: \"kubernetes.io/projected/52123256-1372-49b6-80ed-c3112d14a8fa-kube-api-access-4gzz8\") pod \"test-operator-controller-manager-7866795846-xdrrr\" 
(UID: \"52123256-1372-49b6-80ed-c3112d14a8fa\") " pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.872764 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.894977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mh6\" (UniqueName: \"kubernetes.io/projected/d81b14bf-a056-4780-af1a-bf38babee5b3-kube-api-access-57mh6\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.895067 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.895093 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvrwt\" (UniqueName: \"kubernetes.io/projected/69ef021e-1b46-4aeb-8023-93f6fb366396-kube-api-access-hvrwt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-98zrv\" (UID: \"69ef021e-1b46-4aeb-8023-93f6fb366396\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.895150 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.909161 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8"] Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.933744 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq"] Feb 18 11:51:52 crc kubenswrapper[4922]: W0218 11:51:52.936468 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01766bee_50bd_4dcb_9b3d_831486ddeaf4.slice/crio-19acb113026a915dd4ce9f4fe1ac57917d836682a2553268a443680fae51107f WatchSource:0}: Error finding container 19acb113026a915dd4ce9f4fe1ac57917d836682a2553268a443680fae51107f: Status 404 returned error can't find the container with id 19acb113026a915dd4ce9f4fe1ac57917d836682a2553268a443680fae51107f Feb 18 11:51:52 crc kubenswrapper[4922]: I0218 11:51:52.945271 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" Feb 18 11:51:52 crc kubenswrapper[4922]: W0218 11:51:52.986558 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f73f1d_e472_411e_adc0_6755c47aa72b.slice/crio-52ff0b8371c8a99b343de4e6df91d29b03a6e78611a45c73c04ac853e4527c63 WatchSource:0}: Error finding container 52ff0b8371c8a99b343de4e6df91d29b03a6e78611a45c73c04ac853e4527c63: Status 404 returned error can't find the container with id 52ff0b8371c8a99b343de4e6df91d29b03a6e78611a45c73c04ac853e4527c63 Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.000077 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.001348 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.001412 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvrwt\" (UniqueName: \"kubernetes.io/projected/69ef021e-1b46-4aeb-8023-93f6fb366396-kube-api-access-hvrwt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-98zrv\" (UID: \"69ef021e-1b46-4aeb-8023-93f6fb366396\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.001457 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.001515 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57mh6\" (UniqueName: \"kubernetes.io/projected/d81b14bf-a056-4780-af1a-bf38babee5b3-kube-api-access-57mh6\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.001895 4922 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.001972 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.50195399 +0000 UTC m=+915.229658070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "metrics-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.002240 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.002273 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.502259458 +0000 UTC m=+915.229963538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.038068 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvrwt\" (UniqueName: \"kubernetes.io/projected/69ef021e-1b46-4aeb-8023-93f6fb366396-kube-api-access-hvrwt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-98zrv\" (UID: \"69ef021e-1b46-4aeb-8023-93f6fb366396\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.045753 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mh6\" (UniqueName: \"kubernetes.io/projected/d81b14bf-a056-4780-af1a-bf38babee5b3-kube-api-access-57mh6\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " 
pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.078872 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.107786 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.136708 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.172458 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.205742 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.205984 4922 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.206047 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert podName:081d9ec7-e338-437a-b3bc-af9b788db66a nodeName:}" failed. No retries permitted until 2026-02-18 11:51:54.206029297 +0000 UTC m=+915.933733377 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" (UID: "081d9ec7-e338-437a-b3bc-af9b788db66a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.209526 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" event={"ID":"61f73f1d-e472-411e-adc0-6755c47aa72b","Type":"ContainerStarted","Data":"52ff0b8371c8a99b343de4e6df91d29b03a6e78611a45c73c04ac853e4527c63"} Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.211190 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" event={"ID":"ae81863a-2778-4505-9106-c850f873a75d","Type":"ContainerStarted","Data":"d316eb12f438139d42c0ec170cdc8521aebcb89b1d6e0bad06606ee4c60afabe"} Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.212247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" event={"ID":"4c9af0bf-50d7-42ef-a8df-241b5ec63f5a","Type":"ContainerStarted","Data":"346670c4a29e5cbeb5110a5ec9ae34defeb274394842d7356c5ddb63453b5f3f"} Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.213165 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" event={"ID":"01766bee-50bd-4dcb-9b3d-831486ddeaf4","Type":"ContainerStarted","Data":"19acb113026a915dd4ce9f4fe1ac57917d836682a2553268a443680fae51107f"} Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.234573 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" 
event={"ID":"51cd14ee-9b8a-421f-80bb-d208b752079d","Type":"ContainerStarted","Data":"b921c092fab28063a92a42657c3522fca4ca2ef3974b76ef9ab0416c1fb67ee8"} Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.252737 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr" event={"ID":"0032092e-84ca-426d-8f15-5141f4a8da20","Type":"ContainerStarted","Data":"8d7cceffd7f6603febb5d39ddb325bd8f746cfd19eb6f4b0ff54cf01e0776946"} Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.324882 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.334767 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59"] Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.345295 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8811b6_4023_427d_a893_628e0dd338e8.slice/crio-8558ba6625dea2210b7b67db1e11d44c62b1c4bce4a06664d839c01b9382c622 WatchSource:0}: Error finding container 8558ba6625dea2210b7b67db1e11d44c62b1c4bce4a06664d839c01b9382c622: Status 404 returned error can't find the container with id 8558ba6625dea2210b7b67db1e11d44c62b1c4bce4a06664d839c01b9382c622 Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.347858 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7753280d_fc59_4887_9d87_a2cfd83e7ba9.slice/crio-0b141124e1520dbab2b2ea5158452a8e8b854d2cff468477b9a91b87418c6be6 WatchSource:0}: Error finding container 0b141124e1520dbab2b2ea5158452a8e8b854d2cff468477b9a91b87418c6be6: Status 404 returned error can't find the container with id 0b141124e1520dbab2b2ea5158452a8e8b854d2cff468477b9a91b87418c6be6 Feb 18 11:51:53 crc 
kubenswrapper[4922]: I0218 11:51:53.509439 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.509574 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.509574 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.509669 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:54.509646225 +0000 UTC m=+916.237350335 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.509746 4922 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.509837 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:54.50982045 +0000 UTC m=+916.237524530 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "metrics-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.561717 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr"] Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.577000 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324031ff_ceae_4065_9955_fd5745647cb0.slice/crio-791eb39defb8137ec3bf31cfb665d8d3beb61451f0c1d645e315b166828a58a3 WatchSource:0}: Error finding container 791eb39defb8137ec3bf31cfb665d8d3beb61451f0c1d645e315b166828a58a3: Status 404 returned error can't find the container with id 791eb39defb8137ec3bf31cfb665d8d3beb61451f0c1d645e315b166828a58a3 Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.583449 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-c597h"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.606864 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w"] Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.610399 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b4a58a_81d7_4129_8f45_5429e963676e.slice/crio-5cc223c3dbe87c0c0b49f820260318eabbe9d2c5885bfa1b38b08950af80fa5e WatchSource:0}: Error finding container 5cc223c3dbe87c0c0b49f820260318eabbe9d2c5885bfa1b38b08950af80fa5e: Status 404 returned error can't find the container with id 5cc223c3dbe87c0c0b49f820260318eabbe9d2c5885bfa1b38b08950af80fa5e Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.611706 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2936db6d_8a5b_4da8_9e52_e508a6e757fe.slice/crio-1fe78d73db97d66f5edadb46991fad71ebe88ff4c15640b936826b5262f74e49 WatchSource:0}: Error finding container 1fe78d73db97d66f5edadb46991fad71ebe88ff4c15640b936826b5262f74e49: Status 404 returned error can't find the container with id 1fe78d73db97d66f5edadb46991fad71ebe88ff4c15640b936826b5262f74e49 Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.734904 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m"] Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.741293 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183b09db_ca5a_4aa1_b87b_908de4dc44ff.slice/crio-f9b9316101309ac1a5566770b337b359f68832e29afb6db778bc8e5893f76954 WatchSource:0}: Error finding container f9b9316101309ac1a5566770b337b359f68832e29afb6db778bc8e5893f76954: Status 404 returned error can't 
find the container with id f9b9316101309ac1a5566770b337b359f68832e29afb6db778bc8e5893f76954 Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.742652 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.766827 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.813492 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.813671 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.819222 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert podName:3c16d873-1097-4f56-913f-cc366ed34c23 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:55.819183873 +0000 UTC m=+917.546887963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert") pod "infra-operator-controller-manager-79d975b745-krt25" (UID: "3c16d873-1097-4f56-913f-cc366ed34c23") : secret "infra-operator-webhook-server-cert" not found Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.859759 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7"] Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.879329 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t"] Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.883608 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7487625_0c9e_4396_8eb8_5840ce4344c8.slice/crio-22611ac8f1f288ae6fc73e5b9d603d332e759835a75a6f142ebd627e2e63073d WatchSource:0}: Error finding container 22611ac8f1f288ae6fc73e5b9d603d332e759835a75a6f142ebd627e2e63073d: Status 404 returned error can't find the container with id 22611ac8f1f288ae6fc73e5b9d603d332e759835a75a6f142ebd627e2e63073d Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.897121 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 
500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lrbmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-gwbk7_openstack-operators(a7487625-0c9e-4396-8eb8-5840ce4344c8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.899029 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" podUID="a7487625-0c9e-4396-8eb8-5840ce4344c8" Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.900157 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c487619_568f_44a0_9d23_037794ada114.slice/crio-b5d20a2cd29055e06106754975fbfd6dee0dbeeb700bde338d90c8c037f97863 WatchSource:0}: Error finding container b5d20a2cd29055e06106754975fbfd6dee0dbeeb700bde338d90c8c037f97863: Status 404 returned error can't find the container with id b5d20a2cd29055e06106754975fbfd6dee0dbeeb700bde338d90c8c037f97863 Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.905467 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-xdrrr"] Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.908900 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.158:5001/openstack-k8s-operators/watcher-operator:21dfa39c3cddfadb564fb9c5cae6c76789d51664,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nc7k5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5689f5d7c4-95x8t_openstack-operators(4c487619-568f-44a0-9d23-037794ada114): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.909062 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66351682_3cdf_41cc_80d9_0bbb020144d2.slice/crio-6527504348a11d8b2de92ff5b0dd9f81a7f98185edf3b4e73fb6440ce36147f7 WatchSource:0}: Error finding container 
6527504348a11d8b2de92ff5b0dd9f81a7f98185edf3b4e73fb6440ce36147f7: Status 404 returned error can't find the container with id 6527504348a11d8b2de92ff5b0dd9f81a7f98185edf3b4e73fb6440ce36147f7 Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.910034 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" podUID="4c487619-568f-44a0-9d23-037794ada114" Feb 18 11:51:53 crc kubenswrapper[4922]: W0218 11:51:53.910562 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod387afbf1_afa5_414c_a22a_83a6a8197ff7.slice/crio-32ddb711968631bbecf3506594ca3fb6079bd04b1c2852f65477b75e233773fe WatchSource:0}: Error finding container 32ddb711968631bbecf3506594ca3fb6079bd04b1c2852f65477b75e233773fe: Status 404 returned error can't find the container with id 32ddb711968631bbecf3506594ca3fb6079bd04b1c2852f65477b75e233773fe Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.916636 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr"] Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.918330 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zn6pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-btlqf_openstack-operators(387afbf1-afa5-414c-a22a-83a6a8197ff7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.919580 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" podUID="387afbf1-afa5-414c-a22a-83a6a8197ff7" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.918395 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hczzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-hddmr_openstack-operators(66351682-3cdf-41cc-80d9-0bbb020144d2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 11:51:53 crc kubenswrapper[4922]: E0218 11:51:53.921474 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" podUID="66351682-3cdf-41cc-80d9-0bbb020144d2" Feb 18 11:51:53 crc kubenswrapper[4922]: I0218 11:51:53.929073 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf"] Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.007685 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv"] Feb 18 11:51:54 crc kubenswrapper[4922]: W0218 11:51:54.015903 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ef021e_1b46_4aeb_8023_93f6fb366396.slice/crio-1ba3a63b189a222a4db8bd883f9aea17f842e39f62bb27b037ac0bada74fffba WatchSource:0}: Error finding container 1ba3a63b189a222a4db8bd883f9aea17f842e39f62bb27b037ac0bada74fffba: Status 404 returned error can't find the container with id 1ba3a63b189a222a4db8bd883f9aea17f842e39f62bb27b037ac0bada74fffba Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.020086 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hvrwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-98zrv_openstack-operators(69ef021e-1b46-4aeb-8023-93f6fb366396): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.021226 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" podUID="69ef021e-1b46-4aeb-8023-93f6fb366396" Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.224474 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.224608 4922 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.225050 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert podName:081d9ec7-e338-437a-b3bc-af9b788db66a nodeName:}" failed. No retries permitted until 2026-02-18 11:51:56.225025769 +0000 UTC m=+917.952729849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" (UID: "081d9ec7-e338-437a-b3bc-af9b788db66a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.262959 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" event={"ID":"324031ff-ceae-4065-9955-fd5745647cb0","Type":"ContainerStarted","Data":"791eb39defb8137ec3bf31cfb665d8d3beb61451f0c1d645e315b166828a58a3"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.264989 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" event={"ID":"66351682-3cdf-41cc-80d9-0bbb020144d2","Type":"ContainerStarted","Data":"6527504348a11d8b2de92ff5b0dd9f81a7f98185edf3b4e73fb6440ce36147f7"} Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.266181 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" podUID="66351682-3cdf-41cc-80d9-0bbb020144d2" Feb 
18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.274210 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v" event={"ID":"0a8811b6-4023-427d-a893-628e0dd338e8","Type":"ContainerStarted","Data":"8558ba6625dea2210b7b67db1e11d44c62b1c4bce4a06664d839c01b9382c622"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.275213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" event={"ID":"4c487619-568f-44a0-9d23-037794ada114","Type":"ContainerStarted","Data":"b5d20a2cd29055e06106754975fbfd6dee0dbeeb700bde338d90c8c037f97863"} Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.277404 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.158:5001/openstack-k8s-operators/watcher-operator:21dfa39c3cddfadb564fb9c5cae6c76789d51664\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" podUID="4c487619-568f-44a0-9d23-037794ada114" Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.277782 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" event={"ID":"42271b89-6aba-4e15-a2a1-856b656a1b6e","Type":"ContainerStarted","Data":"bca8f660631f35a5242fb876dfc13783218eb26554e5bcd938da8dc505261074"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.289904 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r" event={"ID":"183b09db-ca5a-4aa1-b87b-908de4dc44ff","Type":"ContainerStarted","Data":"f9b9316101309ac1a5566770b337b359f68832e29afb6db778bc8e5893f76954"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.297949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" event={"ID":"8eae5053-64f3-401a-a151-dbf22f30a845","Type":"ContainerStarted","Data":"fbc581451c7775e4ed002eaee46cb078dba2b7394443f839c50f8f9d53eca2b0"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.303432 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" event={"ID":"90b4a58a-81d7-4129-8f45-5429e963676e","Type":"ContainerStarted","Data":"5cc223c3dbe87c0c0b49f820260318eabbe9d2c5885bfa1b38b08950af80fa5e"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.307563 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" event={"ID":"69ef021e-1b46-4aeb-8023-93f6fb366396","Type":"ContainerStarted","Data":"1ba3a63b189a222a4db8bd883f9aea17f842e39f62bb27b037ac0bada74fffba"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.310497 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" event={"ID":"387afbf1-afa5-414c-a22a-83a6a8197ff7","Type":"ContainerStarted","Data":"32ddb711968631bbecf3506594ca3fb6079bd04b1c2852f65477b75e233773fe"} Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.310987 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" podUID="69ef021e-1b46-4aeb-8023-93f6fb366396" Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.312437 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" podUID="387afbf1-afa5-414c-a22a-83a6a8197ff7" Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.313708 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59" event={"ID":"7753280d-fc59-4887-9d87-a2cfd83e7ba9","Type":"ContainerStarted","Data":"0b141124e1520dbab2b2ea5158452a8e8b854d2cff468477b9a91b87418c6be6"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.317927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" event={"ID":"52123256-1372-49b6-80ed-c3112d14a8fa","Type":"ContainerStarted","Data":"8bbea18a5b40a76c11c1625c7047a94a50daccbec8f3489452a3f764e74051a8"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.321346 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" event={"ID":"2936db6d-8a5b-4da8-9e52-e508a6e757fe","Type":"ContainerStarted","Data":"1fe78d73db97d66f5edadb46991fad71ebe88ff4c15640b936826b5262f74e49"} Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.331504 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" event={"ID":"a7487625-0c9e-4396-8eb8-5840ce4344c8","Type":"ContainerStarted","Data":"22611ac8f1f288ae6fc73e5b9d603d332e759835a75a6f142ebd627e2e63073d"} Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.333282 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" podUID="a7487625-0c9e-4396-8eb8-5840ce4344c8" Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.531073 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:54 crc kubenswrapper[4922]: I0218 11:51:54.531149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.531231 4922 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.531296 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.531309 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:56.531285024 +0000 UTC m=+918.258989094 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "metrics-server-cert" not found Feb 18 11:51:54 crc kubenswrapper[4922]: E0218 11:51:54.531739 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:56.531684194 +0000 UTC m=+918.259388334 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "webhook-server-cert" not found Feb 18 11:51:55 crc kubenswrapper[4922]: E0218 11:51:55.355552 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" podUID="a7487625-0c9e-4396-8eb8-5840ce4344c8" Feb 18 11:51:55 crc kubenswrapper[4922]: E0218 11:51:55.357469 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" podUID="69ef021e-1b46-4aeb-8023-93f6fb366396" Feb 18 11:51:55 crc kubenswrapper[4922]: E0218 11:51:55.359918 4922 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" podUID="387afbf1-afa5-414c-a22a-83a6a8197ff7" Feb 18 11:51:55 crc kubenswrapper[4922]: E0218 11:51:55.360036 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.158:5001/openstack-k8s-operators/watcher-operator:21dfa39c3cddfadb564fb9c5cae6c76789d51664\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" podUID="4c487619-568f-44a0-9d23-037794ada114" Feb 18 11:51:55 crc kubenswrapper[4922]: E0218 11:51:55.363754 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" podUID="66351682-3cdf-41cc-80d9-0bbb020144d2" Feb 18 11:51:55 crc kubenswrapper[4922]: I0218 11:51:55.851761 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:51:55 crc kubenswrapper[4922]: E0218 11:51:55.851916 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 11:51:55 crc 
kubenswrapper[4922]: E0218 11:51:55.851982 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert podName:3c16d873-1097-4f56-913f-cc366ed34c23 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:59.851964394 +0000 UTC m=+921.579668474 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert") pod "infra-operator-controller-manager-79d975b745-krt25" (UID: "3c16d873-1097-4f56-913f-cc366ed34c23") : secret "infra-operator-webhook-server-cert" not found Feb 18 11:51:56 crc kubenswrapper[4922]: I0218 11:51:56.265563 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:51:56 crc kubenswrapper[4922]: E0218 11:51:56.266029 4922 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:51:56 crc kubenswrapper[4922]: E0218 11:51:56.266075 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert podName:081d9ec7-e338-437a-b3bc-af9b788db66a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:00.266061928 +0000 UTC m=+921.993766008 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" (UID: "081d9ec7-e338-437a-b3bc-af9b788db66a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:51:56 crc kubenswrapper[4922]: I0218 11:51:56.569847 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:56 crc kubenswrapper[4922]: I0218 11:51:56.569944 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:51:56 crc kubenswrapper[4922]: E0218 11:51:56.570000 4922 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 11:51:56 crc kubenswrapper[4922]: E0218 11:51:56.570092 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:00.570068806 +0000 UTC m=+922.297772936 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "metrics-server-cert" not found Feb 18 11:51:56 crc kubenswrapper[4922]: E0218 11:51:56.570150 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 11:51:56 crc kubenswrapper[4922]: E0218 11:51:56.570515 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:00.570474046 +0000 UTC m=+922.298178126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "webhook-server-cert" not found Feb 18 11:51:59 crc kubenswrapper[4922]: I0218 11:51:59.919570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:51:59 crc kubenswrapper[4922]: E0218 11:51:59.919833 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 11:51:59 crc kubenswrapper[4922]: E0218 11:51:59.920066 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert 
podName:3c16d873-1097-4f56-913f-cc366ed34c23 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.920049761 +0000 UTC m=+929.647753831 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert") pod "infra-operator-controller-manager-79d975b745-krt25" (UID: "3c16d873-1097-4f56-913f-cc366ed34c23") : secret "infra-operator-webhook-server-cert" not found Feb 18 11:52:00 crc kubenswrapper[4922]: I0218 11:52:00.326285 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:00 crc kubenswrapper[4922]: E0218 11:52:00.326518 4922 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:52:00 crc kubenswrapper[4922]: E0218 11:52:00.326747 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert podName:081d9ec7-e338-437a-b3bc-af9b788db66a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:08.326726088 +0000 UTC m=+930.054430168 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" (UID: "081d9ec7-e338-437a-b3bc-af9b788db66a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:52:00 crc kubenswrapper[4922]: I0218 11:52:00.629982 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:00 crc kubenswrapper[4922]: I0218 11:52:00.630073 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:00 crc kubenswrapper[4922]: E0218 11:52:00.630219 4922 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 11:52:00 crc kubenswrapper[4922]: E0218 11:52:00.630284 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 11:52:00 crc kubenswrapper[4922]: E0218 11:52:00.630346 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:08.630318766 +0000 UTC m=+930.358022846 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "metrics-server-cert" not found Feb 18 11:52:00 crc kubenswrapper[4922]: E0218 11:52:00.630411 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:08.630401138 +0000 UTC m=+930.358105218 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "webhook-server-cert" not found Feb 18 11:52:05 crc kubenswrapper[4922]: E0218 11:52:05.823013 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642" Feb 18 11:52:05 crc kubenswrapper[4922]: E0218 11:52:05.823717 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z5fnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d8bf5c495-2ncv8_openstack-operators(01766bee-50bd-4dcb-9b3d-831486ddeaf4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:05 crc kubenswrapper[4922]: E0218 11:52:05.825036 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" podUID="01766bee-50bd-4dcb-9b3d-831486ddeaf4" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.392200 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.392714 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zzpq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-6z2cq_openstack-operators(61f73f1d-e472-411e-adc0-6755c47aa72b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.394310 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" podUID="61f73f1d-e472-411e-adc0-6755c47aa72b" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.431831 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:c1e33e962043cd6e3d09ebd225cb72781451dba7af2d57522e5c6eedbdc91642\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" podUID="01766bee-50bd-4dcb-9b3d-831486ddeaf4" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.432065 4922 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" podUID="61f73f1d-e472-411e-adc0-6755c47aa72b" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.950883 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.951081 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ghd7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-c597h_openstack-operators(2936db6d-8a5b-4da8-9e52-e508a6e757fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:06 crc kubenswrapper[4922]: E0218 11:52:06.952330 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" podUID="2936db6d-8a5b-4da8-9e52-e508a6e757fe" Feb 18 11:52:07 crc kubenswrapper[4922]: E0218 11:52:07.441908 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" podUID="2936db6d-8a5b-4da8-9e52-e508a6e757fe" Feb 18 11:52:07 crc kubenswrapper[4922]: E0218 11:52:07.530571 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 18 11:52:07 crc kubenswrapper[4922]: E0218 11:52:07.530885 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qmqv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-z7pdl_openstack-operators(42271b89-6aba-4e15-a2a1-856b656a1b6e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:07 crc kubenswrapper[4922]: E0218 11:52:07.532098 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" podUID="42271b89-6aba-4e15-a2a1-856b656a1b6e" Feb 18 11:52:07 crc kubenswrapper[4922]: I0218 11:52:07.943435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod 
\"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:52:07 crc kubenswrapper[4922]: E0218 11:52:07.943603 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 11:52:07 crc kubenswrapper[4922]: E0218 11:52:07.943650 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert podName:3c16d873-1097-4f56-913f-cc366ed34c23 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.943633375 +0000 UTC m=+945.671337455 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert") pod "infra-operator-controller-manager-79d975b745-krt25" (UID: "3c16d873-1097-4f56-913f-cc366ed34c23") : secret "infra-operator-webhook-server-cert" not found Feb 18 11:52:08 crc kubenswrapper[4922]: I0218 11:52:08.348700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:08 crc kubenswrapper[4922]: E0218 11:52:08.348908 4922 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:52:08 crc kubenswrapper[4922]: E0218 11:52:08.349000 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert podName:081d9ec7-e338-437a-b3bc-af9b788db66a nodeName:}" failed. 
No retries permitted until 2026-02-18 11:52:24.348975439 +0000 UTC m=+946.076679539 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" (UID: "081d9ec7-e338-437a-b3bc-af9b788db66a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 11:52:08 crc kubenswrapper[4922]: E0218 11:52:08.447521 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" podUID="42271b89-6aba-4e15-a2a1-856b656a1b6e" Feb 18 11:52:08 crc kubenswrapper[4922]: I0218 11:52:08.652783 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:08 crc kubenswrapper[4922]: I0218 11:52:08.652937 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:08 crc kubenswrapper[4922]: E0218 11:52:08.653139 4922 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 11:52:08 crc 
kubenswrapper[4922]: E0218 11:52:08.653226 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 11:52:08 crc kubenswrapper[4922]: E0218 11:52:08.653270 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:24.653244433 +0000 UTC m=+946.380948513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "metrics-server-cert" not found Feb 18 11:52:08 crc kubenswrapper[4922]: E0218 11:52:08.653324 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs podName:d81b14bf-a056-4780-af1a-bf38babee5b3 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:24.653296964 +0000 UTC m=+946.381001244 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs") pod "openstack-operator-controller-manager-8685d86d55-pbbl7" (UID: "d81b14bf-a056-4780-af1a-bf38babee5b3") : secret "webhook-server-cert" not found Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.117808 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.118292 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ngz5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-4fm4m_openstack-operators(8eae5053-64f3-401a-a151-dbf22f30a845): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.119909 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" podUID="8eae5053-64f3-401a-a151-dbf22f30a845" Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.455213 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" podUID="8eae5053-64f3-401a-a151-dbf22f30a845" Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.696902 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.697068 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l57qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-wrd8w_openstack-operators(90b4a58a-81d7-4129-8f45-5429e963676e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:09 crc kubenswrapper[4922]: E0218 11:52:09.698388 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" podUID="90b4a58a-81d7-4129-8f45-5429e963676e" Feb 18 11:52:09 crc kubenswrapper[4922]: I0218 11:52:09.974777 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 11:52:10 crc kubenswrapper[4922]: E0218 11:52:10.233033 4922 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 18 11:52:10 crc kubenswrapper[4922]: E0218 11:52:10.233215 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kfvwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-jtfzr_openstack-operators(324031ff-ceae-4065-9955-fd5745647cb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:52:10 crc kubenswrapper[4922]: E0218 11:52:10.234529 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" podUID="324031ff-ceae-4065-9955-fd5745647cb0" Feb 18 11:52:10 crc kubenswrapper[4922]: E0218 11:52:10.462601 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" podUID="324031ff-ceae-4065-9955-fd5745647cb0" Feb 18 11:52:10 crc kubenswrapper[4922]: E0218 11:52:10.462741 4922 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" podUID="90b4a58a-81d7-4129-8f45-5429e963676e" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.509092 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" event={"ID":"ae81863a-2778-4505-9106-c850f873a75d","Type":"ContainerStarted","Data":"82ad5d281eb09ba4cafc040c8576b6d48cf0446118f10f3120bba11bc65711c8"} Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.509421 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.511402 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" event={"ID":"66351682-3cdf-41cc-80d9-0bbb020144d2","Type":"ContainerStarted","Data":"51e441e8bb0f01482649fae743fac2790e5190f0aa5133eeee078827ccb30a64"} Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.511596 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.512815 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v" event={"ID":"0a8811b6-4023-427d-a893-628e0dd338e8","Type":"ContainerStarted","Data":"3820e59525b5adc62b77ed733a0295e398e1ed515fb19969b5a88c399217a3ce"} Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.513812 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.520125 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr" event={"ID":"0032092e-84ca-426d-8f15-5141f4a8da20","Type":"ContainerStarted","Data":"94ea8c9fdab71f7b4b88df97f12706a635037fa70f83abf40da86ab871608a4f"} Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.520806 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.529282 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" event={"ID":"52123256-1372-49b6-80ed-c3112d14a8fa","Type":"ContainerStarted","Data":"047496db1edfbd8b01295f5532ec0ff045c27cd97d716a46e88784309b6676cf"} Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.529714 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.541831 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" podStartSLOduration=4.577771667 podStartE2EDuration="23.541809546s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:52.866095803 +0000 UTC m=+914.593799883" lastFinishedPulling="2026-02-18 11:52:11.830133682 +0000 UTC m=+933.557837762" observedRunningTime="2026-02-18 11:52:14.532920532 +0000 UTC m=+936.260624622" watchObservedRunningTime="2026-02-18 11:52:14.541809546 +0000 UTC m=+936.269513636" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.570158 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" podStartSLOduration=4.625805312 podStartE2EDuration="22.570138021s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.885536896 +0000 UTC m=+915.613240986" lastFinishedPulling="2026-02-18 11:52:11.829869615 +0000 UTC m=+933.557573695" observedRunningTime="2026-02-18 11:52:14.560646781 +0000 UTC m=+936.288350861" watchObservedRunningTime="2026-02-18 11:52:14.570138021 +0000 UTC m=+936.297842101" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.584579 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr" podStartSLOduration=4.378875351 podStartE2EDuration="23.584563285s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.164554681 +0000 UTC m=+914.892258761" lastFinishedPulling="2026-02-18 11:52:12.370242615 +0000 UTC m=+934.097946695" observedRunningTime="2026-02-18 11:52:14.582094452 +0000 UTC m=+936.309798542" watchObservedRunningTime="2026-02-18 11:52:14.584563285 +0000 UTC m=+936.312267355" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.613736 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v" podStartSLOduration=5.739083378 podStartE2EDuration="23.613698039s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.363983671 +0000 UTC m=+915.091687751" lastFinishedPulling="2026-02-18 11:52:11.238598332 +0000 UTC m=+932.966302412" observedRunningTime="2026-02-18 11:52:14.610456438 +0000 UTC m=+936.338160518" watchObservedRunningTime="2026-02-18 11:52:14.613698039 +0000 UTC m=+936.341402119" Feb 18 11:52:14 crc kubenswrapper[4922]: I0218 11:52:14.635582 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" podStartSLOduration=2.486291949 podStartE2EDuration="22.635562291s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.917974984 +0000 UTC m=+915.645679064" lastFinishedPulling="2026-02-18 11:52:14.067245326 +0000 UTC m=+935.794949406" observedRunningTime="2026-02-18 11:52:14.628494503 +0000 UTC m=+936.356198583" watchObservedRunningTime="2026-02-18 11:52:14.635562291 +0000 UTC m=+936.363266371" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.553644 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" event={"ID":"69ef021e-1b46-4aeb-8023-93f6fb366396","Type":"ContainerStarted","Data":"7d70293cb5ba2e1388847139393d5104dcfe893ef9454d4c2fdc3b93b52db0b1"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.555868 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" event={"ID":"51cd14ee-9b8a-421f-80bb-d208b752079d","Type":"ContainerStarted","Data":"45a6fcea78733aa6e7b3b497f092daf8022f0dfd64f7f32bcee6bfe8fe12a7be"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.556023 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.557509 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" event={"ID":"387afbf1-afa5-414c-a22a-83a6a8197ff7","Type":"ContainerStarted","Data":"180fdbde122f8b351f0c2d8ece628061b11a164646fd707a997de9002f4deb40"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.557701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" Feb 18 
11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.559417 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59" event={"ID":"7753280d-fc59-4887-9d87-a2cfd83e7ba9","Type":"ContainerStarted","Data":"15296e2d46dd291afb2887d2b6f31dd860a198fa3e67d29211f44e3a7fd4d95b"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.559583 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.564931 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" event={"ID":"4c9af0bf-50d7-42ef-a8df-241b5ec63f5a","Type":"ContainerStarted","Data":"1cf1b27cd102d4af62bc817ebf9c9dc41c91fbf75803b518d18a080ee20b50cc"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.565138 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.567667 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" event={"ID":"a7487625-0c9e-4396-8eb8-5840ce4344c8","Type":"ContainerStarted","Data":"37c4e74c297481c523e5c0083a06a074082fbbd2d201360965ad39fe666f1fee"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.567948 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.569764 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" 
event={"ID":"4c487619-568f-44a0-9d23-037794ada114","Type":"ContainerStarted","Data":"2e53e09cc17728c9f81e08eeb6af31e0317a0fb97b3090a1e4af610afef4d55f"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.569987 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.571778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r" event={"ID":"183b09db-ca5a-4aa1-b87b-908de4dc44ff","Type":"ContainerStarted","Data":"d6ae254be8e60d3b9f6e2d4f4171811a94d90e720d17a673689ec004a0e4548b"} Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.580418 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-98zrv" podStartSLOduration=3.533474971 podStartE2EDuration="23.580384952s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:54.019912085 +0000 UTC m=+915.747616165" lastFinishedPulling="2026-02-18 11:52:14.066822066 +0000 UTC m=+935.794526146" observedRunningTime="2026-02-18 11:52:15.576677518 +0000 UTC m=+937.304381598" watchObservedRunningTime="2026-02-18 11:52:15.580384952 +0000 UTC m=+937.308089032" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.608933 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" podStartSLOduration=5.942251843 podStartE2EDuration="24.608913581s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.165175127 +0000 UTC m=+914.892879207" lastFinishedPulling="2026-02-18 11:52:11.831836865 +0000 UTC m=+933.559540945" observedRunningTime="2026-02-18 11:52:15.606780287 +0000 UTC m=+937.334484377" watchObservedRunningTime="2026-02-18 11:52:15.608913581 +0000 
UTC m=+937.336617681" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.623972 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" podStartSLOduration=4.323505826 podStartE2EDuration="24.623955581s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.896934064 +0000 UTC m=+915.624638144" lastFinishedPulling="2026-02-18 11:52:14.197383819 +0000 UTC m=+935.925087899" observedRunningTime="2026-02-18 11:52:15.623572291 +0000 UTC m=+937.351276371" watchObservedRunningTime="2026-02-18 11:52:15.623955581 +0000 UTC m=+937.351659661" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.642461 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" podStartSLOduration=3.372358058 podStartE2EDuration="23.642442987s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.917970684 +0000 UTC m=+915.645674764" lastFinishedPulling="2026-02-18 11:52:14.188055613 +0000 UTC m=+935.915759693" observedRunningTime="2026-02-18 11:52:15.638454326 +0000 UTC m=+937.366158406" watchObservedRunningTime="2026-02-18 11:52:15.642442987 +0000 UTC m=+937.370147067" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.675702 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" podStartSLOduration=3.4060110359999998 podStartE2EDuration="23.675683495s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.908740741 +0000 UTC m=+915.636444821" lastFinishedPulling="2026-02-18 11:52:14.17841319 +0000 UTC m=+935.906117280" observedRunningTime="2026-02-18 11:52:15.674513616 +0000 UTC m=+937.402217716" watchObservedRunningTime="2026-02-18 11:52:15.675683495 +0000 UTC m=+937.403387575" 
Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.678421 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" podStartSLOduration=6.605500822 podStartE2EDuration="24.678411024s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.165176827 +0000 UTC m=+914.892880907" lastFinishedPulling="2026-02-18 11:52:11.238087029 +0000 UTC m=+932.965791109" observedRunningTime="2026-02-18 11:52:15.654526322 +0000 UTC m=+937.382230412" watchObservedRunningTime="2026-02-18 11:52:15.678411024 +0000 UTC m=+937.406115104" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.697701 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59" podStartSLOduration=6.81592842 podStartE2EDuration="24.69767176s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.357263422 +0000 UTC m=+915.084967502" lastFinishedPulling="2026-02-18 11:52:11.239006772 +0000 UTC m=+932.966710842" observedRunningTime="2026-02-18 11:52:15.694074479 +0000 UTC m=+937.421778559" watchObservedRunningTime="2026-02-18 11:52:15.69767176 +0000 UTC m=+937.425375840" Feb 18 11:52:15 crc kubenswrapper[4922]: I0218 11:52:15.722751 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r" podStartSLOduration=6.229774498 podStartE2EDuration="23.722704731s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.745169346 +0000 UTC m=+915.472873426" lastFinishedPulling="2026-02-18 11:52:11.238099579 +0000 UTC m=+932.965803659" observedRunningTime="2026-02-18 11:52:15.718018403 +0000 UTC m=+937.445722503" watchObservedRunningTime="2026-02-18 11:52:15.722704731 +0000 UTC m=+937.450408811" Feb 18 11:52:16 crc 
kubenswrapper[4922]: I0218 11:52:16.581394 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r" Feb 18 11:52:17 crc kubenswrapper[4922]: I0218 11:52:17.588545 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" event={"ID":"01766bee-50bd-4dcb-9b3d-831486ddeaf4","Type":"ContainerStarted","Data":"127bc88f95726b17d67e1ed81455b0bf6fec58eade654362da10a6251df24455"} Feb 18 11:52:17 crc kubenswrapper[4922]: I0218 11:52:17.589079 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" Feb 18 11:52:17 crc kubenswrapper[4922]: I0218 11:52:17.613565 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" podStartSLOduration=2.251615018 podStartE2EDuration="26.613279366s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.037201439 +0000 UTC m=+914.764905519" lastFinishedPulling="2026-02-18 11:52:17.398865777 +0000 UTC m=+939.126569867" observedRunningTime="2026-02-18 11:52:17.604922035 +0000 UTC m=+939.332626115" watchObservedRunningTime="2026-02-18 11:52:17.613279366 +0000 UTC m=+939.340983446" Feb 18 11:52:18 crc kubenswrapper[4922]: I0218 11:52:18.598240 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" event={"ID":"61f73f1d-e472-411e-adc0-6755c47aa72b","Type":"ContainerStarted","Data":"b1c8a7f969057e190de16d7fd5bf9a3f548ada5fb9a4c7786e1771f7aa1d433f"} Feb 18 11:52:18 crc kubenswrapper[4922]: I0218 11:52:18.598482 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" Feb 18 11:52:18 crc 
kubenswrapper[4922]: I0218 11:52:18.618597 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" podStartSLOduration=2.2675542 podStartE2EDuration="27.618572762s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.000388171 +0000 UTC m=+914.728092261" lastFinishedPulling="2026-02-18 11:52:18.351406743 +0000 UTC m=+940.079110823" observedRunningTime="2026-02-18 11:52:18.613902384 +0000 UTC m=+940.341606464" watchObservedRunningTime="2026-02-18 11:52:18.618572762 +0000 UTC m=+940.346276852" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.092768 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-f8lbk" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.161803 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-2ncv8" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.246833 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-bnvrn" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.260623 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qm24h" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.308564 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-82hvr" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.463634 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-r4v59" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.575878 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-tn47v" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.628218 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" event={"ID":"90b4a58a-81d7-4129-8f45-5429e963676e","Type":"ContainerStarted","Data":"85cd2e4c3af3c7ceda3755929bd0aea9fabbf156d02b991371e49b388dfcb094"} Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.628398 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.645613 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" podStartSLOduration=3.761794587 podStartE2EDuration="31.645598713s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.614479199 +0000 UTC m=+915.342183279" lastFinishedPulling="2026-02-18 11:52:21.498283305 +0000 UTC m=+943.225987405" observedRunningTime="2026-02-18 11:52:22.642811413 +0000 UTC m=+944.370515493" watchObservedRunningTime="2026-02-18 11:52:22.645598713 +0000 UTC m=+944.373302793" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.670898 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-gwbk7" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.725691 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-hddmr" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.799854 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-2bk9r" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.875787 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-btlqf" Feb 18 11:52:22 crc kubenswrapper[4922]: I0218 11:52:22.948720 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-xdrrr" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.005178 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5689f5d7c4-95x8t" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.636420 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" event={"ID":"2936db6d-8a5b-4da8-9e52-e508a6e757fe","Type":"ContainerStarted","Data":"a9ad118fadf52a9e0910d8da6266e65f45d417d6f03b25cf3fed755ed810c95b"} Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.636935 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.637973 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" event={"ID":"42271b89-6aba-4e15-a2a1-856b656a1b6e","Type":"ContainerStarted","Data":"d33e5739ebe03a6b08a8dc4ff9a6c1e0363333bee406aaa0c17e49624cf90bcb"} Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.638131 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.639570 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" event={"ID":"8eae5053-64f3-401a-a151-dbf22f30a845","Type":"ContainerStarted","Data":"d98a9cb5bd91905d3ad32584c3e5e725d67b75989ab4ec89b830bf3ff9718b92"} Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.639798 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.660677 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" podStartSLOduration=3.6670884 podStartE2EDuration="32.660658816s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.614462099 +0000 UTC m=+915.342166169" lastFinishedPulling="2026-02-18 11:52:22.608032505 +0000 UTC m=+944.335736585" observedRunningTime="2026-02-18 11:52:23.653344541 +0000 UTC m=+945.381048621" watchObservedRunningTime="2026-02-18 11:52:23.660658816 +0000 UTC m=+945.388362896" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.668247 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" podStartSLOduration=3.826105469 podStartE2EDuration="32.668224666s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.733298996 +0000 UTC m=+915.461003076" lastFinishedPulling="2026-02-18 11:52:22.575418193 +0000 UTC m=+944.303122273" observedRunningTime="2026-02-18 11:52:23.667247572 +0000 UTC m=+945.394951662" watchObservedRunningTime="2026-02-18 11:52:23.668224666 +0000 UTC m=+945.395928756" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.693140 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" podStartSLOduration=2.889316914 
podStartE2EDuration="31.693122604s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.773463559 +0000 UTC m=+915.501167649" lastFinishedPulling="2026-02-18 11:52:22.577269259 +0000 UTC m=+944.304973339" observedRunningTime="2026-02-18 11:52:23.686265431 +0000 UTC m=+945.413969501" watchObservedRunningTime="2026-02-18 11:52:23.693122604 +0000 UTC m=+945.420826694" Feb 18 11:52:23 crc kubenswrapper[4922]: I0218 11:52:23.996595 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.003140 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c16d873-1097-4f56-913f-cc366ed34c23-cert\") pod \"infra-operator-controller-manager-79d975b745-krt25\" (UID: \"3c16d873-1097-4f56-913f-cc366ed34c23\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.135438 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.402107 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.408832 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081d9ec7-e338-437a-b3bc-af9b788db66a-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7\" (UID: \"081d9ec7-e338-437a-b3bc-af9b788db66a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.532089 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.665373 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-krt25"] Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.707644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.707837 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.713863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-metrics-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.714337 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d81b14bf-a056-4780-af1a-bf38babee5b3-webhook-certs\") pod \"openstack-operator-controller-manager-8685d86d55-pbbl7\" (UID: \"d81b14bf-a056-4780-af1a-bf38babee5b3\") " 
pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.761605 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7"] Feb 18 11:52:24 crc kubenswrapper[4922]: W0218 11:52:24.764277 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod081d9ec7_e338_437a_b3bc_af9b788db66a.slice/crio-0513720a13f6309d20acc61d826343eb53fc8d747c3f4a08928e47331fe3a2e2 WatchSource:0}: Error finding container 0513720a13f6309d20acc61d826343eb53fc8d747c3f4a08928e47331fe3a2e2: Status 404 returned error can't find the container with id 0513720a13f6309d20acc61d826343eb53fc8d747c3f4a08928e47331fe3a2e2 Feb 18 11:52:24 crc kubenswrapper[4922]: I0218 11:52:24.875129 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.107784 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7"] Feb 18 11:52:25 crc kubenswrapper[4922]: W0218 11:52:25.116232 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd81b14bf_a056_4780_af1a_bf38babee5b3.slice/crio-a7fdb6783296e00b83986aaae921a03105789cc0a2d757b7d0ac926a70148f5e WatchSource:0}: Error finding container a7fdb6783296e00b83986aaae921a03105789cc0a2d757b7d0ac926a70148f5e: Status 404 returned error can't find the container with id a7fdb6783296e00b83986aaae921a03105789cc0a2d757b7d0ac926a70148f5e Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.661690 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" 
event={"ID":"081d9ec7-e338-437a-b3bc-af9b788db66a","Type":"ContainerStarted","Data":"0513720a13f6309d20acc61d826343eb53fc8d747c3f4a08928e47331fe3a2e2"} Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.664563 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" event={"ID":"3c16d873-1097-4f56-913f-cc366ed34c23","Type":"ContainerStarted","Data":"229de988738c71267208a4416ebaff65f4884c952d0e200b0493557c02a22b27"} Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.667222 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" event={"ID":"324031ff-ceae-4065-9955-fd5745647cb0","Type":"ContainerStarted","Data":"c245b240c2f559aa64973e5251f50f071a0fb9f06bd3e9e15d5557b81424e4aa"} Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.667438 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.669247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" event={"ID":"d81b14bf-a056-4780-af1a-bf38babee5b3","Type":"ContainerStarted","Data":"d147eb3f6c88484ca7ad3e2c09d3f57880738a52e8fa5290ce12b88df3cab757"} Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.669275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" event={"ID":"d81b14bf-a056-4780-af1a-bf38babee5b3","Type":"ContainerStarted","Data":"a7fdb6783296e00b83986aaae921a03105789cc0a2d757b7d0ac926a70148f5e"} Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.669930 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:25 crc 
kubenswrapper[4922]: I0218 11:52:25.698227 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" podStartSLOduration=2.906799023 podStartE2EDuration="34.698202837s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:51:53.580451761 +0000 UTC m=+915.308155841" lastFinishedPulling="2026-02-18 11:52:25.371855575 +0000 UTC m=+947.099559655" observedRunningTime="2026-02-18 11:52:25.694085653 +0000 UTC m=+947.421789733" watchObservedRunningTime="2026-02-18 11:52:25.698202837 +0000 UTC m=+947.425906917" Feb 18 11:52:25 crc kubenswrapper[4922]: I0218 11:52:25.730844 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" podStartSLOduration=33.730817849 podStartE2EDuration="33.730817849s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:52:25.725187627 +0000 UTC m=+947.452891707" watchObservedRunningTime="2026-02-18 11:52:25.730817849 +0000 UTC m=+947.458521939" Feb 18 11:52:27 crc kubenswrapper[4922]: I0218 11:52:27.691588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" event={"ID":"081d9ec7-e338-437a-b3bc-af9b788db66a","Type":"ContainerStarted","Data":"0cc47605d08777efd6a5a6e67438d19cb2b880d65062ef7cd760b4327724ae0e"} Feb 18 11:52:27 crc kubenswrapper[4922]: I0218 11:52:27.692058 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:27 crc kubenswrapper[4922]: I0218 11:52:27.693395 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" event={"ID":"3c16d873-1097-4f56-913f-cc366ed34c23","Type":"ContainerStarted","Data":"cd55a5bdfcfaf39d2c010237362cd86c422ac15a198c01ef3e476a658a677415"} Feb 18 11:52:27 crc kubenswrapper[4922]: I0218 11:52:27.693498 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:52:27 crc kubenswrapper[4922]: I0218 11:52:27.721145 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" podStartSLOduration=33.172203579 podStartE2EDuration="35.72112692s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="2026-02-18 11:52:24.765732037 +0000 UTC m=+946.493436107" lastFinishedPulling="2026-02-18 11:52:27.314655358 +0000 UTC m=+949.042359448" observedRunningTime="2026-02-18 11:52:27.712831071 +0000 UTC m=+949.440535151" watchObservedRunningTime="2026-02-18 11:52:27.72112692 +0000 UTC m=+949.448831000" Feb 18 11:52:27 crc kubenswrapper[4922]: I0218 11:52:27.737013 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" podStartSLOduration=34.121607013 podStartE2EDuration="36.736999491s" podCreationTimestamp="2026-02-18 11:51:51 +0000 UTC" firstStartedPulling="2026-02-18 11:52:24.68259616 +0000 UTC m=+946.410300240" lastFinishedPulling="2026-02-18 11:52:27.297988638 +0000 UTC m=+949.025692718" observedRunningTime="2026-02-18 11:52:27.733439201 +0000 UTC m=+949.461143281" watchObservedRunningTime="2026-02-18 11:52:27.736999491 +0000 UTC m=+949.464703571" Feb 18 11:52:32 crc kubenswrapper[4922]: I0218 11:52:32.144235 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6z2cq" Feb 18 11:52:32 crc 
kubenswrapper[4922]: I0218 11:52:32.556202 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jtfzr" Feb 18 11:52:32 crc kubenswrapper[4922]: I0218 11:52:32.587151 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-c597h" Feb 18 11:52:32 crc kubenswrapper[4922]: I0218 11:52:32.600389 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wrd8w" Feb 18 11:52:32 crc kubenswrapper[4922]: I0218 11:52:32.698467 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-4fm4m" Feb 18 11:52:32 crc kubenswrapper[4922]: I0218 11:52:32.698769 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z7pdl" Feb 18 11:52:34 crc kubenswrapper[4922]: I0218 11:52:34.141806 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-krt25" Feb 18 11:52:34 crc kubenswrapper[4922]: I0218 11:52:34.538638 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7" Feb 18 11:52:34 crc kubenswrapper[4922]: I0218 11:52:34.881233 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-8685d86d55-pbbl7" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.182287 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d5kxc"] Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.184019 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.185878 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rnrrw" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.186173 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.186972 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.187202 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.205457 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d5kxc"] Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.235304 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-r4dpg"] Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.238904 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.241003 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.246545 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-r4dpg"] Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.252530 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9skmt\" (UniqueName: \"kubernetes.io/projected/277ea358-73ba-466c-bad6-c788f49749a2-kube-api-access-9skmt\") pod \"dnsmasq-dns-675f4bcbfc-d5kxc\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.252582 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ea358-73ba-466c-bad6-c788f49749a2-config\") pod \"dnsmasq-dns-675f4bcbfc-d5kxc\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.353458 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4flx4\" (UniqueName: \"kubernetes.io/projected/12309c77-cb46-401f-a335-08ca9d74e019-kube-api-access-4flx4\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.353534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.353574 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9skmt\" (UniqueName: \"kubernetes.io/projected/277ea358-73ba-466c-bad6-c788f49749a2-kube-api-access-9skmt\") pod \"dnsmasq-dns-675f4bcbfc-d5kxc\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.353607 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-config\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.353637 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ea358-73ba-466c-bad6-c788f49749a2-config\") pod \"dnsmasq-dns-675f4bcbfc-d5kxc\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.354461 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ea358-73ba-466c-bad6-c788f49749a2-config\") pod \"dnsmasq-dns-675f4bcbfc-d5kxc\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.380493 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9skmt\" (UniqueName: \"kubernetes.io/projected/277ea358-73ba-466c-bad6-c788f49749a2-kube-api-access-9skmt\") pod \"dnsmasq-dns-675f4bcbfc-d5kxc\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc 
kubenswrapper[4922]: I0218 11:52:54.454613 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4flx4\" (UniqueName: \"kubernetes.io/projected/12309c77-cb46-401f-a335-08ca9d74e019-kube-api-access-4flx4\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.454698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.454735 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-config\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.455832 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.455849 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-config\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.483419 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4flx4\" (UniqueName: \"kubernetes.io/projected/12309c77-cb46-401f-a335-08ca9d74e019-kube-api-access-4flx4\") pod \"dnsmasq-dns-78dd6ddcc-r4dpg\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.506972 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.562664 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.807716 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-r4dpg"] Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.894647 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" event={"ID":"12309c77-cb46-401f-a335-08ca9d74e019","Type":"ContainerStarted","Data":"b35ffd4661bbe49f0defaf1ce8038237c6db0e474221315740b32c40e8be885c"} Feb 18 11:52:54 crc kubenswrapper[4922]: I0218 11:52:54.930189 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d5kxc"] Feb 18 11:52:54 crc kubenswrapper[4922]: W0218 11:52:54.933575 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277ea358_73ba_466c_bad6_c788f49749a2.slice/crio-be1e6e7fb4233f0222ce6fa46078e0d2c0447caff506e59746208cd3058261a5 WatchSource:0}: Error finding container be1e6e7fb4233f0222ce6fa46078e0d2c0447caff506e59746208cd3058261a5: Status 404 returned error can't find the container with id be1e6e7fb4233f0222ce6fa46078e0d2c0447caff506e59746208cd3058261a5 Feb 18 11:52:55 crc kubenswrapper[4922]: I0218 11:52:55.903871 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" 
event={"ID":"277ea358-73ba-466c-bad6-c788f49749a2","Type":"ContainerStarted","Data":"be1e6e7fb4233f0222ce6fa46078e0d2c0447caff506e59746208cd3058261a5"} Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.023481 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d5kxc"] Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.058327 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rkhdz"] Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.059942 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.066163 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rkhdz"] Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.101656 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm4rb\" (UniqueName: \"kubernetes.io/projected/5728000b-35c8-4748-a8bd-722a9d4da288-kube-api-access-zm4rb\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.101775 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-config\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.102102 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " 
pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.203904 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.203972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm4rb\" (UniqueName: \"kubernetes.io/projected/5728000b-35c8-4748-a8bd-722a9d4da288-kube-api-access-zm4rb\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.203995 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-config\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.204977 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-config\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.205074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.232483 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm4rb\" (UniqueName: \"kubernetes.io/projected/5728000b-35c8-4748-a8bd-722a9d4da288-kube-api-access-zm4rb\") pod \"dnsmasq-dns-666b6646f7-rkhdz\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.314639 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-r4dpg"] Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.349799 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gcsm9"] Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.352018 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.363877 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gcsm9"] Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.389684 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.422114 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.422180 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-config\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.422222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kzn\" (UniqueName: \"kubernetes.io/projected/a9adaf2c-91b1-42f9-8d96-307a08030cce-kube-api-access-n7kzn\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.523841 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.523926 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-config\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.523987 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kzn\" (UniqueName: \"kubernetes.io/projected/a9adaf2c-91b1-42f9-8d96-307a08030cce-kube-api-access-n7kzn\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.524926 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.524939 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-config\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.543202 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7kzn\" (UniqueName: \"kubernetes.io/projected/a9adaf2c-91b1-42f9-8d96-307a08030cce-kube-api-access-n7kzn\") pod \"dnsmasq-dns-57d769cc4f-gcsm9\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.684734 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:52:57 crc kubenswrapper[4922]: I0218 11:52:57.897145 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rkhdz"] Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.185834 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.187291 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.203019 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.203377 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ctw5n" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.204215 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.204352 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.204569 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.204701 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.204829 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.217072 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335729 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335836 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvvw\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-kube-api-access-9gvvw\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335874 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335928 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335953 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cef557d2-b935-4cf6-98f1-d3c2251c0e38-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335975 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-config-data\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.335990 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cef557d2-b935-4cf6-98f1-d3c2251c0e38-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.336016 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.336032 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.336058 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440245 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440304 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvvw\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-kube-api-access-9gvvw\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440353 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440400 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cef557d2-b935-4cf6-98f1-d3c2251c0e38-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440445 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-config-data\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440462 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cef557d2-b935-4cf6-98f1-d3c2251c0e38-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440483 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440501 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.440529 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.445168 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.445551 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.445603 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.445975 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.449388 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.451437 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cef557d2-b935-4cf6-98f1-d3c2251c0e38-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.461040 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-config-data\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.463419 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.463737 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.468600 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gvvw\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-kube-api-access-9gvvw\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.471925 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cef557d2-b935-4cf6-98f1-d3c2251c0e38-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.492700 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.493676 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.493827 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.495895 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.497294 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.497755 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.498026 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.498215 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.501072 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 11:52:58 crc 
kubenswrapper[4922]: I0218 11:52:58.501342 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8fwmc" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.506612 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.524074 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.543791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12b84523-522e-4e8c-b78e-0094262fb1f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.543880 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.543938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.543967 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544028 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544054 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544093 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xcj\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-kube-api-access-d9xcj\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544127 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544157 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12b84523-522e-4e8c-b78e-0094262fb1f8-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544207 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.544228 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12b84523-522e-4e8c-b78e-0094262fb1f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645829 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645845 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645880 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12b84523-522e-4e8c-b78e-0094262fb1f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645956 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.645982 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.646022 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.646033 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.646049 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.646084 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xcj\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-kube-api-access-d9xcj\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.646333 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.646995 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.647023 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.647131 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.647550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.651009 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.651062 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12b84523-522e-4e8c-b78e-0094262fb1f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.656268 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.662109 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9xcj\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-kube-api-access-d9xcj\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.664473 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.672874 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12b84523-522e-4e8c-b78e-0094262fb1f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:58 crc kubenswrapper[4922]: I0218 11:52:58.870300 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.695127 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.698752 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.703684 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.703774 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2zdpl" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.703828 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.704247 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.712080 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.716053 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.760910 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.760965 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.760987 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-config-data-default\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.761016 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.761035 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.761065 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.761085 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9knsj\" (UniqueName: 
\"kubernetes.io/projected/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-kube-api-access-9knsj\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.761103 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-kolla-config\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.867769 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.867836 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.867862 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-config-data-default\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.867896 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: 
\"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.867920 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.867976 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.868004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9knsj\" (UniqueName: \"kubernetes.io/projected/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-kube-api-access-9knsj\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.868027 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-kolla-config\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.868587 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc 
kubenswrapper[4922]: I0218 11:52:59.869211 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.876721 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.877215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.889508 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.895312 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9knsj\" (UniqueName: \"kubernetes.io/projected/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-kube-api-access-9knsj\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.895825 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-kolla-config\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.896507 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-config-data-default\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:52:59 crc kubenswrapper[4922]: I0218 11:52:59.897133 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302e3b56-c5a4-4e80-bb7e-a9e6a61a119e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e\") " pod="openstack/openstack-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.020561 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.922121 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.924031 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.926600 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.927647 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.927905 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bjv2t" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.929641 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.932558 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.982962 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tqxd\" (UniqueName: \"kubernetes.io/projected/873b23d0-3c83-4ab7-8178-1c4832c544a0-kube-api-access-9tqxd\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983026 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983073 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/873b23d0-3c83-4ab7-8178-1c4832c544a0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/873b23d0-3c83-4ab7-8178-1c4832c544a0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983138 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983173 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b23d0-3c83-4ab7-8178-1c4832c544a0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:00 crc kubenswrapper[4922]: I0218 11:53:00.983222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.084491 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b23d0-3c83-4ab7-8178-1c4832c544a0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.084557 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.084584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.084689 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.085337 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.085528 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.086160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tqxd\" (UniqueName: \"kubernetes.io/projected/873b23d0-3c83-4ab7-8178-1c4832c544a0-kube-api-access-9tqxd\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.086527 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b23d0-3c83-4ab7-8178-1c4832c544a0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.086570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/873b23d0-3c83-4ab7-8178-1c4832c544a0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.086624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " 
pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.087159 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.087951 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/873b23d0-3c83-4ab7-8178-1c4832c544a0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.088630 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/873b23d0-3c83-4ab7-8178-1c4832c544a0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.091971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873b23d0-3c83-4ab7-8178-1c4832c544a0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.099241 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/873b23d0-3c83-4ab7-8178-1c4832c544a0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 
11:53:01.116301 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tqxd\" (UniqueName: \"kubernetes.io/projected/873b23d0-3c83-4ab7-8178-1c4832c544a0-kube-api-access-9tqxd\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.118755 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"873b23d0-3c83-4ab7-8178-1c4832c544a0\") " pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.259173 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.309145 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.312107 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.315858 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.316116 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.318122 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-f8bcd" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.327284 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.392515 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-config-data\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.392576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-kolla-config\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.392598 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.392615 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.392634 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm44r\" (UniqueName: \"kubernetes.io/projected/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-kube-api-access-zm44r\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.495312 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-config-data\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.495435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-kolla-config\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.495469 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.495499 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " 
pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.495525 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm44r\" (UniqueName: \"kubernetes.io/projected/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-kube-api-access-zm44r\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.496344 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-kolla-config\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.496442 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-config-data\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.499578 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.507968 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.516356 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm44r\" (UniqueName: 
\"kubernetes.io/projected/0ce20f52-4b9d-47a6-8da7-c64cd1d15623-kube-api-access-zm44r\") pod \"memcached-0\" (UID: \"0ce20f52-4b9d-47a6-8da7-c64cd1d15623\") " pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.647198 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 18 11:53:01 crc kubenswrapper[4922]: I0218 11:53:01.949766 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" event={"ID":"5728000b-35c8-4748-a8bd-722a9d4da288","Type":"ContainerStarted","Data":"05332c9ac22e2e275a6ff8155167585702fc418b67e75c5ce73b68418331a924"} Feb 18 11:53:03 crc kubenswrapper[4922]: I0218 11:53:03.914819 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 11:53:03 crc kubenswrapper[4922]: I0218 11:53:03.915998 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 11:53:03 crc kubenswrapper[4922]: I0218 11:53:03.920725 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-r55b4" Feb 18 11:53:03 crc kubenswrapper[4922]: I0218 11:53:03.972948 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 11:53:04 crc kubenswrapper[4922]: I0218 11:53:04.053443 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpfc\" (UniqueName: \"kubernetes.io/projected/2aa305a0-c015-43c2-851c-8eff778238be-kube-api-access-hhpfc\") pod \"kube-state-metrics-0\" (UID: \"2aa305a0-c015-43c2-851c-8eff778238be\") " pod="openstack/kube-state-metrics-0" Feb 18 11:53:04 crc kubenswrapper[4922]: I0218 11:53:04.155656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpfc\" (UniqueName: 
\"kubernetes.io/projected/2aa305a0-c015-43c2-851c-8eff778238be-kube-api-access-hhpfc\") pod \"kube-state-metrics-0\" (UID: \"2aa305a0-c015-43c2-851c-8eff778238be\") " pod="openstack/kube-state-metrics-0" Feb 18 11:53:04 crc kubenswrapper[4922]: I0218 11:53:04.182104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpfc\" (UniqueName: \"kubernetes.io/projected/2aa305a0-c015-43c2-851c-8eff778238be-kube-api-access-hhpfc\") pod \"kube-state-metrics-0\" (UID: \"2aa305a0-c015-43c2-851c-8eff778238be\") " pod="openstack/kube-state-metrics-0" Feb 18 11:53:04 crc kubenswrapper[4922]: I0218 11:53:04.255144 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.166426 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.169475 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.174385 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.174566 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.175341 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.175490 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.175641 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.175937 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xmthr" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.179589 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.182055 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.182768 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272078 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272158 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272200 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272225 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6c8\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-kube-api-access-2d6c8\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272316 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272375 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272400 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272432 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.272456 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc 
kubenswrapper[4922]: I0218 11:53:05.374327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374704 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374738 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374757 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6c8\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-kube-api-access-2d6c8\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " 
pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374842 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374890 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374949 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.374975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.375622 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.375671 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.376204 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.379737 4922 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.379797 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d461f0c4a551673a0d7d7003637451f1312f1b9722a2159a051859daee296e97/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.380199 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.392706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.393191 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.393198 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.393671 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.396772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6c8\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-kube-api-access-2d6c8\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.413093 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:05 crc kubenswrapper[4922]: I0218 11:53:05.528526 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.589903 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-996pg"]
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.590901 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.595708 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.595907 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nmmtj"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.596557 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.610293 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-stvc7"]
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.611900 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.618447 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-996pg"]
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.630375 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-stvc7"]
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc92\" (UniqueName: \"kubernetes.io/projected/cf286fe0-1b17-475a-b71b-ac4897c2f59d-kube-api-access-bkc92\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698526 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-run\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698553 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-log-ovn\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698581 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d0a226-07e2-402d-a868-2f8374670dac-combined-ca-bundle\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698602 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf286fe0-1b17-475a-b71b-ac4897c2f59d-scripts\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698617 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-run-ovn\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698632 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngmd\" (UniqueName: \"kubernetes.io/projected/a2d0a226-07e2-402d-a868-2f8374670dac-kube-api-access-gngmd\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698648 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-etc-ovs\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698675 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-run\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698697 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-log\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698713 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-lib\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698753 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d0a226-07e2-402d-a868-2f8374670dac-ovn-controller-tls-certs\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.698775 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2d0a226-07e2-402d-a868-2f8374670dac-scripts\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800266 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc92\" (UniqueName: \"kubernetes.io/projected/cf286fe0-1b17-475a-b71b-ac4897c2f59d-kube-api-access-bkc92\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800612 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-run\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800655 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-log-ovn\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800691 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d0a226-07e2-402d-a868-2f8374670dac-combined-ca-bundle\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf286fe0-1b17-475a-b71b-ac4897c2f59d-scripts\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-run-ovn\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gngmd\" (UniqueName: \"kubernetes.io/projected/a2d0a226-07e2-402d-a868-2f8374670dac-kube-api-access-gngmd\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800794 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-etc-ovs\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800818 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-run\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800842 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-log\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800860 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-lib\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800902 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d0a226-07e2-402d-a868-2f8374670dac-ovn-controller-tls-certs\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.800927 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2d0a226-07e2-402d-a868-2f8374670dac-scripts\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801689 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-run-ovn\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801708 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-etc-ovs\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801734 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-run\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801736 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-run\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801755 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2d0a226-07e2-402d-a868-2f8374670dac-var-log-ovn\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801800 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-lib\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.801823 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cf286fe0-1b17-475a-b71b-ac4897c2f59d-var-log\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.803325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2d0a226-07e2-402d-a868-2f8374670dac-scripts\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.806246 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d0a226-07e2-402d-a868-2f8374670dac-combined-ca-bundle\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.817380 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngmd\" (UniqueName: \"kubernetes.io/projected/a2d0a226-07e2-402d-a868-2f8374670dac-kube-api-access-gngmd\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.828478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc92\" (UniqueName: \"kubernetes.io/projected/cf286fe0-1b17-475a-b71b-ac4897c2f59d-kube-api-access-bkc92\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.830493 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf286fe0-1b17-475a-b71b-ac4897c2f59d-scripts\") pod \"ovn-controller-ovs-stvc7\" (UID: \"cf286fe0-1b17-475a-b71b-ac4897c2f59d\") " pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.857384 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d0a226-07e2-402d-a868-2f8374670dac-ovn-controller-tls-certs\") pod \"ovn-controller-996pg\" (UID: \"a2d0a226-07e2-402d-a868-2f8374670dac\") " pod="openstack/ovn-controller-996pg"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.939282 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-stvc7"
Feb 18 11:53:06 crc kubenswrapper[4922]: I0218 11:53:06.947920 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-996pg"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.156776 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.159443 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.162462 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.162851 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.162509 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bn68r"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.162719 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.163809 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.169703 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.208962 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.209465 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.209683 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.209885 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8q7m\" (UniqueName: \"kubernetes.io/projected/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-kube-api-access-q8q7m\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.210014 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.210129 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.210398 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.210533 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-config\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312013 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312078 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312111 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312180 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8q7m\" (UniqueName: \"kubernetes.io/projected/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-kube-api-access-q8q7m\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312331 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312431 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.312764 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.313439 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.313583 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.313673 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.313743 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-config\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.314531 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-config\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.315621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.315734 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.317279 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.330271 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8q7m\" (UniqueName: \"kubernetes.io/projected/b6f3b4f2-3f65-4278-9cd0-753adfee2ecd-kube-api-access-q8q7m\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.374565 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:07 crc kubenswrapper[4922]: I0218 11:53:07.477330 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 18 11:53:09 crc kubenswrapper[4922]: I0218 11:53:09.807459 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:53:09 crc kubenswrapper[4922]: I0218 11:53:09.807777 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.545649 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.547164 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.552408 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.552503 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ht5pb"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.552649 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.556212 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.576547 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580633 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186f064b-a9e8-4637-a5eb-1646f2e1a783-config\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580725 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580771 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186f064b-a9e8-4637-a5eb-1646f2e1a783-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580817 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580841 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580922 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/186f064b-a9e8-4637-a5eb-1646f2e1a783-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.580956 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npclz\" (UniqueName: \"kubernetes.io/projected/186f064b-a9e8-4637-a5eb-1646f2e1a783-kube-api-access-npclz\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.682876 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/186f064b-a9e8-4637-a5eb-1646f2e1a783-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.682935 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npclz\" (UniqueName: \"kubernetes.io/projected/186f064b-a9e8-4637-a5eb-1646f2e1a783-kube-api-access-npclz\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.682985 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186f064b-a9e8-4637-a5eb-1646f2e1a783-config\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683052 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683088 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186f064b-a9e8-4637-a5eb-1646f2e1a783-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683127 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683154 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683186 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/186f064b-a9e8-4637-a5eb-1646f2e1a783-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.683449 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.684321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186f064b-a9e8-4637-a5eb-1646f2e1a783-config\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " 
pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.684600 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186f064b-a9e8-4637-a5eb-1646f2e1a783-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.688140 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.688283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.692085 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/186f064b-a9e8-4637-a5eb-1646f2e1a783-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.713554 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.740015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npclz\" (UniqueName: 
\"kubernetes.io/projected/186f064b-a9e8-4637-a5eb-1646f2e1a783-kube-api-access-npclz\") pod \"ovsdbserver-sb-0\" (UID: \"186f064b-a9e8-4637-a5eb-1646f2e1a783\") " pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:10 crc kubenswrapper[4922]: I0218 11:53:10.867692 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:11 crc kubenswrapper[4922]: I0218 11:53:11.962984 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 11:53:12 crc kubenswrapper[4922]: E0218 11:53:12.497483 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 11:53:12 crc kubenswrapper[4922]: E0218 11:53:12.497656 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9skmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-d5kxc_openstack(277ea358-73ba-466c-bad6-c788f49749a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:53:12 crc kubenswrapper[4922]: E0218 11:53:12.498838 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" podUID="277ea358-73ba-466c-bad6-c788f49749a2" Feb 18 11:53:12 crc kubenswrapper[4922]: E0218 11:53:12.500242 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 11:53:12 crc kubenswrapper[4922]: E0218 11:53:12.500396 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4flx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-r4dpg_openstack(12309c77-cb46-401f-a335-08ca9d74e019): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:53:12 crc kubenswrapper[4922]: E0218 11:53:12.501922 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" podUID="12309c77-cb46-401f-a335-08ca9d74e019" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.035119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e","Type":"ContainerStarted","Data":"6916ab85b14e41fd88a5f1a98f0b906bdbeec1785f299e2809e2c8b53ae5dd9f"} Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.039662 4922 generic.go:334] "Generic (PLEG): container finished" podID="5728000b-35c8-4748-a8bd-722a9d4da288" containerID="e926ccb5b4c9a13d1cee243527546b45e51365b24bcd44a54ffecfd0423a70d5" exitCode=0 Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.040562 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" 
event={"ID":"5728000b-35c8-4748-a8bd-722a9d4da288","Type":"ContainerDied","Data":"e926ccb5b4c9a13d1cee243527546b45e51365b24bcd44a54ffecfd0423a70d5"} Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.052137 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.245808 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.253838 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.266535 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gcsm9"] Feb 18 11:53:13 crc kubenswrapper[4922]: E0218 11:53:13.307016 4922 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 18 11:53:13 crc kubenswrapper[4922]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5728000b-35c8-4748-a8bd-722a9d4da288/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 18 11:53:13 crc kubenswrapper[4922]: > podSandboxID="05332c9ac22e2e275a6ff8155167585702fc418b67e75c5ce73b68418331a924" Feb 18 11:53:13 crc kubenswrapper[4922]: E0218 11:53:13.307601 4922 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 18 11:53:13 crc kubenswrapper[4922]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zm4rb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-rkhdz_openstack(5728000b-35c8-4748-a8bd-722a9d4da288): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/5728000b-35c8-4748-a8bd-722a9d4da288/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 18 11:53:13 crc kubenswrapper[4922]: > logger="UnhandledError" Feb 18 11:53:13 crc kubenswrapper[4922]: E0218 11:53:13.309475 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5728000b-35c8-4748-a8bd-722a9d4da288/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.384351 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-stvc7"] Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.395561 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 18 11:53:13 crc 
kubenswrapper[4922]: I0218 11:53:13.402134 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.595652 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.609624 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.640942 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9skmt\" (UniqueName: \"kubernetes.io/projected/277ea358-73ba-466c-bad6-c788f49749a2-kube-api-access-9skmt\") pod \"277ea358-73ba-466c-bad6-c788f49749a2\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.641381 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ea358-73ba-466c-bad6-c788f49749a2-config\") pod \"277ea358-73ba-466c-bad6-c788f49749a2\" (UID: \"277ea358-73ba-466c-bad6-c788f49749a2\") " Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.641494 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-dns-svc\") pod \"12309c77-cb46-401f-a335-08ca9d74e019\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.641717 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-config\") pod \"12309c77-cb46-401f-a335-08ca9d74e019\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.641746 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4flx4\" (UniqueName: \"kubernetes.io/projected/12309c77-cb46-401f-a335-08ca9d74e019-kube-api-access-4flx4\") pod \"12309c77-cb46-401f-a335-08ca9d74e019\" (UID: \"12309c77-cb46-401f-a335-08ca9d74e019\") " Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.643051 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-config" (OuterVolumeSpecName: "config") pod "12309c77-cb46-401f-a335-08ca9d74e019" (UID: "12309c77-cb46-401f-a335-08ca9d74e019"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.643552 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/277ea358-73ba-466c-bad6-c788f49749a2-config" (OuterVolumeSpecName: "config") pod "277ea358-73ba-466c-bad6-c788f49749a2" (UID: "277ea358-73ba-466c-bad6-c788f49749a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.645272 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12309c77-cb46-401f-a335-08ca9d74e019" (UID: "12309c77-cb46-401f-a335-08ca9d74e019"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.647317 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277ea358-73ba-466c-bad6-c788f49749a2-kube-api-access-9skmt" (OuterVolumeSpecName: "kube-api-access-9skmt") pod "277ea358-73ba-466c-bad6-c788f49749a2" (UID: "277ea358-73ba-466c-bad6-c788f49749a2"). InnerVolumeSpecName "kube-api-access-9skmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.650743 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12309c77-cb46-401f-a335-08ca9d74e019-kube-api-access-4flx4" (OuterVolumeSpecName: "kube-api-access-4flx4") pod "12309c77-cb46-401f-a335-08ca9d74e019" (UID: "12309c77-cb46-401f-a335-08ca9d74e019"). InnerVolumeSpecName "kube-api-access-4flx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.745173 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9skmt\" (UniqueName: \"kubernetes.io/projected/277ea358-73ba-466c-bad6-c788f49749a2-kube-api-access-9skmt\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.745208 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/277ea358-73ba-466c-bad6-c788f49749a2-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.745217 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.745225 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12309c77-cb46-401f-a335-08ca9d74e019-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.745234 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4flx4\" (UniqueName: \"kubernetes.io/projected/12309c77-cb46-401f-a335-08ca9d74e019-kube-api-access-4flx4\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.751398 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] 
Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.764036 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-996pg"] Feb 18 11:53:13 crc kubenswrapper[4922]: W0218 11:53:13.780776 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d0a226_07e2_402d_a868_2f8374670dac.slice/crio-3055b44c45c190c60818e0cf5c1c85425414a78117655f4d765cb08f58235bc9 WatchSource:0}: Error finding container 3055b44c45c190c60818e0cf5c1c85425414a78117655f4d765cb08f58235bc9: Status 404 returned error can't find the container with id 3055b44c45c190c60818e0cf5c1c85425414a78117655f4d765cb08f58235bc9 Feb 18 11:53:13 crc kubenswrapper[4922]: I0218 11:53:13.835355 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.050580 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12b84523-522e-4e8c-b78e-0094262fb1f8","Type":"ContainerStarted","Data":"aab8e1d8bb4c1667bc6b73808bdb819ba395465155e9f67195316f9044955cf6"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.052971 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aa305a0-c015-43c2-851c-8eff778238be","Type":"ContainerStarted","Data":"3a3d098eed640f36965fdadc7b1dd0c83929950b22d8057eb96a4ca71c50bd14"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.055210 4922 generic.go:334] "Generic (PLEG): container finished" podID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerID="cb1206c209e030ad065b0424209cc4f379d4283c70032fb8543af8119687b039" exitCode=0 Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.055260 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" 
event={"ID":"a9adaf2c-91b1-42f9-8d96-307a08030cce","Type":"ContainerDied","Data":"cb1206c209e030ad065b0424209cc4f379d4283c70032fb8543af8119687b039"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.055316 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" event={"ID":"a9adaf2c-91b1-42f9-8d96-307a08030cce","Type":"ContainerStarted","Data":"8aa2b445a2e846f21de5ba41b7250cd1d270eacaa01407089987d4676bfb4dcd"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.059487 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" event={"ID":"12309c77-cb46-401f-a335-08ca9d74e019","Type":"ContainerDied","Data":"b35ffd4661bbe49f0defaf1ce8038237c6db0e474221315740b32c40e8be885c"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.059559 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-r4dpg" Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.061924 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0ce20f52-4b9d-47a6-8da7-c64cd1d15623","Type":"ContainerStarted","Data":"1c52538cc464082ecb9a366072d250142efbfd791d4f3400f11f51c22f9cd1c6"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.063451 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"186f064b-a9e8-4637-a5eb-1646f2e1a783","Type":"ContainerStarted","Data":"79466567273a5850a41b267901e4466ff74878fe7f3d6561388315f310a215ca"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.064565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-stvc7" event={"ID":"cf286fe0-1b17-475a-b71b-ac4897c2f59d","Type":"ContainerStarted","Data":"9f9b7fad52923615947a0d6e7151a78b4ebb8b6e007995f60cadafcce5688fe4"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.067203 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerStarted","Data":"bf8e622508a488fa6a5aab10a0db437c746f98562fadc68cb32fcd3c28724d08"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.069855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cef557d2-b935-4cf6-98f1-d3c2251c0e38","Type":"ContainerStarted","Data":"d9788f1e654fa9ba3ad3f0a6ae9798137af27ad57e7e68121b8391b0725d166c"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.071433 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" event={"ID":"277ea358-73ba-466c-bad6-c788f49749a2","Type":"ContainerDied","Data":"be1e6e7fb4233f0222ce6fa46078e0d2c0447caff506e59746208cd3058261a5"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.071527 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d5kxc" Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.073016 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"873b23d0-3c83-4ab7-8178-1c4832c544a0","Type":"ContainerStarted","Data":"f0b5c3b14ba2985c7887a419fa639114ee9a1d29ac6136b93fd46a9476daf2bd"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.084172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-996pg" event={"ID":"a2d0a226-07e2-402d-a868-2f8374670dac","Type":"ContainerStarted","Data":"3055b44c45c190c60818e0cf5c1c85425414a78117655f4d765cb08f58235bc9"} Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.179642 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-r4dpg"] Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.188272 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-r4dpg"] Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 
11:53:14.201205 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d5kxc"] Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.209072 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d5kxc"] Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.675792 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 18 11:53:14 crc kubenswrapper[4922]: W0218 11:53:14.697589 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6f3b4f2_3f65_4278_9cd0_753adfee2ecd.slice/crio-095284e789ef763dbe0314370172e682e42b58e817284911a877b0672eea210f WatchSource:0}: Error finding container 095284e789ef763dbe0314370172e682e42b58e817284911a877b0672eea210f: Status 404 returned error can't find the container with id 095284e789ef763dbe0314370172e682e42b58e817284911a877b0672eea210f Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.984223 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12309c77-cb46-401f-a335-08ca9d74e019" path="/var/lib/kubelet/pods/12309c77-cb46-401f-a335-08ca9d74e019/volumes" Feb 18 11:53:14 crc kubenswrapper[4922]: I0218 11:53:14.984590 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277ea358-73ba-466c-bad6-c788f49749a2" path="/var/lib/kubelet/pods/277ea358-73ba-466c-bad6-c788f49749a2/volumes" Feb 18 11:53:15 crc kubenswrapper[4922]: I0218 11:53:15.093044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" event={"ID":"5728000b-35c8-4748-a8bd-722a9d4da288","Type":"ContainerStarted","Data":"7d7de311b38252b803029fed4ce2a344d9898ebcba31e61063d58705f1576aca"} Feb 18 11:53:15 crc kubenswrapper[4922]: I0218 11:53:15.093272 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:53:15 crc 
kubenswrapper[4922]: I0218 11:53:15.094780 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd","Type":"ContainerStarted","Data":"095284e789ef763dbe0314370172e682e42b58e817284911a877b0672eea210f"} Feb 18 11:53:15 crc kubenswrapper[4922]: I0218 11:53:15.096841 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" event={"ID":"a9adaf2c-91b1-42f9-8d96-307a08030cce","Type":"ContainerStarted","Data":"bade2e297d9e59c84721045c2242b2738e6af09fe8594f0328f6a4f10a05370f"} Feb 18 11:53:15 crc kubenswrapper[4922]: I0218 11:53:15.097007 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:53:15 crc kubenswrapper[4922]: I0218 11:53:15.112076 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" podStartSLOduration=6.585709778 podStartE2EDuration="18.112061417s" podCreationTimestamp="2026-02-18 11:52:57 +0000 UTC" firstStartedPulling="2026-02-18 11:53:01.180649 +0000 UTC m=+982.908353080" lastFinishedPulling="2026-02-18 11:53:12.707000639 +0000 UTC m=+994.434704719" observedRunningTime="2026-02-18 11:53:15.111538284 +0000 UTC m=+996.839242374" watchObservedRunningTime="2026-02-18 11:53:15.112061417 +0000 UTC m=+996.839765507" Feb 18 11:53:19 crc kubenswrapper[4922]: I0218 11:53:19.005476 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" podStartSLOduration=22.005460706 podStartE2EDuration="22.005460706s" podCreationTimestamp="2026-02-18 11:52:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:15.157752188 +0000 UTC m=+996.885456278" watchObservedRunningTime="2026-02-18 11:53:19.005460706 +0000 UTC m=+1000.733164776" Feb 18 11:53:19 crc 
kubenswrapper[4922]: I0218 11:53:19.904407 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wknpt"] Feb 18 11:53:19 crc kubenswrapper[4922]: I0218 11:53:19.907684 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:19 crc kubenswrapper[4922]: I0218 11:53:19.913574 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 18 11:53:19 crc kubenswrapper[4922]: I0218 11:53:19.942241 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wknpt"] Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.057853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-ovn-rundir\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.057965 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-ovs-rundir\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.057988 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.058008 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-combined-ca-bundle\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.058029 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt7vs\" (UniqueName: \"kubernetes.io/projected/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-kube-api-access-nt7vs\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.058072 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-config\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.067468 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rkhdz"] Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.067682 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="dnsmasq-dns" containerID="cri-o://7d7de311b38252b803029fed4ce2a344d9898ebcba31e61063d58705f1576aca" gracePeriod=10 Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.069627 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.126712 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nhn5n"] 
Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.128879 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.133899 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.156700 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nhn5n"] Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.169249 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-ovn-rundir\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.169402 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-ovs-rundir\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.169435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.169722 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-combined-ca-bundle\") pod \"ovn-controller-metrics-wknpt\" (UID: 
\"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.169750 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt7vs\" (UniqueName: \"kubernetes.io/projected/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-kube-api-access-nt7vs\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.169896 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-config\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.171223 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-ovn-rundir\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.171277 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-ovs-rundir\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.171347 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-config\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc 
kubenswrapper[4922]: I0218 11:53:20.184421 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.194773 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt7vs\" (UniqueName: \"kubernetes.io/projected/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-kube-api-access-nt7vs\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.228134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c831c6ce-ca0c-4f7d-8268-b4efe13e687d-combined-ca-bundle\") pod \"ovn-controller-metrics-wknpt\" (UID: \"c831c6ce-ca0c-4f7d-8268-b4efe13e687d\") " pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.233208 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-wknpt" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.237708 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gcsm9"] Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.238051 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="dnsmasq-dns" containerID="cri-o://bade2e297d9e59c84721045c2242b2738e6af09fe8594f0328f6a4f10a05370f" gracePeriod=10 Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.249734 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.263504 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-mt2n6"] Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.265240 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.268648 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274333 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-config\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274461 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274504 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74b5\" (UniqueName: \"kubernetes.io/projected/30419d6d-3999-43ef-8cd9-07143299061a-kube-api-access-d74b5\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274530 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/66ee71b5-db58-4478-94c7-0067be9c018e-kube-api-access-xglxs\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274589 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274641 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-config\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274667 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.274706 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-dns-svc\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.287070 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mt2n6"] Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.375969 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-config\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376035 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376516 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74b5\" (UniqueName: \"kubernetes.io/projected/30419d6d-3999-43ef-8cd9-07143299061a-kube-api-access-d74b5\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376588 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/66ee71b5-db58-4478-94c7-0067be9c018e-kube-api-access-xglxs\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: 
\"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376783 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376875 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-config\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376895 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.376925 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-dns-svc\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.377418 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 
crc kubenswrapper[4922]: I0218 11:53:20.377770 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.377863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.378202 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-dns-svc\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.378199 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-config\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.378438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.378614 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-config\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.401531 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74b5\" (UniqueName: \"kubernetes.io/projected/30419d6d-3999-43ef-8cd9-07143299061a-kube-api-access-d74b5\") pod \"dnsmasq-dns-5bf47b49b7-nhn5n\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.402103 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/66ee71b5-db58-4478-94c7-0067be9c018e-kube-api-access-xglxs\") pod \"dnsmasq-dns-8554648995-mt2n6\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") " pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.609867 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:20 crc kubenswrapper[4922]: I0218 11:53:20.667116 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:21 crc kubenswrapper[4922]: I0218 11:53:21.151708 4922 generic.go:334] "Generic (PLEG): container finished" podID="5728000b-35c8-4748-a8bd-722a9d4da288" containerID="7d7de311b38252b803029fed4ce2a344d9898ebcba31e61063d58705f1576aca" exitCode=0 Feb 18 11:53:21 crc kubenswrapper[4922]: I0218 11:53:21.151769 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" event={"ID":"5728000b-35c8-4748-a8bd-722a9d4da288","Type":"ContainerDied","Data":"7d7de311b38252b803029fed4ce2a344d9898ebcba31e61063d58705f1576aca"} Feb 18 11:53:21 crc kubenswrapper[4922]: I0218 11:53:21.154532 4922 generic.go:334] "Generic (PLEG): container finished" podID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerID="bade2e297d9e59c84721045c2242b2738e6af09fe8594f0328f6a4f10a05370f" exitCode=0 Feb 18 11:53:21 crc kubenswrapper[4922]: I0218 11:53:21.154584 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" event={"ID":"a9adaf2c-91b1-42f9-8d96-307a08030cce","Type":"ContainerDied","Data":"bade2e297d9e59c84721045c2242b2738e6af09fe8594f0328f6a4f10a05370f"} Feb 18 11:53:22 crc kubenswrapper[4922]: I0218 11:53:22.390096 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.102:5353: connect: connection refused" Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.800917 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.946910 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-dns-svc\") pod \"a9adaf2c-91b1-42f9-8d96-307a08030cce\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.947280 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-config\") pod \"a9adaf2c-91b1-42f9-8d96-307a08030cce\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.947334 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7kzn\" (UniqueName: \"kubernetes.io/projected/a9adaf2c-91b1-42f9-8d96-307a08030cce-kube-api-access-n7kzn\") pod \"a9adaf2c-91b1-42f9-8d96-307a08030cce\" (UID: \"a9adaf2c-91b1-42f9-8d96-307a08030cce\") " Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.954716 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9adaf2c-91b1-42f9-8d96-307a08030cce-kube-api-access-n7kzn" (OuterVolumeSpecName: "kube-api-access-n7kzn") pod "a9adaf2c-91b1-42f9-8d96-307a08030cce" (UID: "a9adaf2c-91b1-42f9-8d96-307a08030cce"). InnerVolumeSpecName "kube-api-access-n7kzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.982495 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-config" (OuterVolumeSpecName: "config") pod "a9adaf2c-91b1-42f9-8d96-307a08030cce" (UID: "a9adaf2c-91b1-42f9-8d96-307a08030cce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:23 crc kubenswrapper[4922]: I0218 11:53:23.998016 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9adaf2c-91b1-42f9-8d96-307a08030cce" (UID: "a9adaf2c-91b1-42f9-8d96-307a08030cce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.049154 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.049193 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9adaf2c-91b1-42f9-8d96-307a08030cce-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.049211 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7kzn\" (UniqueName: \"kubernetes.io/projected/a9adaf2c-91b1-42f9-8d96-307a08030cce-kube-api-access-n7kzn\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.176710 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" event={"ID":"a9adaf2c-91b1-42f9-8d96-307a08030cce","Type":"ContainerDied","Data":"8aa2b445a2e846f21de5ba41b7250cd1d270eacaa01407089987d4676bfb4dcd"} Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.176737 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.176777 4922 scope.go:117] "RemoveContainer" containerID="bade2e297d9e59c84721045c2242b2738e6af09fe8594f0328f6a4f10a05370f" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.211972 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gcsm9"] Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.218038 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gcsm9"] Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.267010 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.457272 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-dns-svc\") pod \"5728000b-35c8-4748-a8bd-722a9d4da288\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.457380 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm4rb\" (UniqueName: \"kubernetes.io/projected/5728000b-35c8-4748-a8bd-722a9d4da288-kube-api-access-zm4rb\") pod \"5728000b-35c8-4748-a8bd-722a9d4da288\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.457423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-config\") pod \"5728000b-35c8-4748-a8bd-722a9d4da288\" (UID: \"5728000b-35c8-4748-a8bd-722a9d4da288\") " Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.463221 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5728000b-35c8-4748-a8bd-722a9d4da288-kube-api-access-zm4rb" (OuterVolumeSpecName: "kube-api-access-zm4rb") pod "5728000b-35c8-4748-a8bd-722a9d4da288" (UID: "5728000b-35c8-4748-a8bd-722a9d4da288"). InnerVolumeSpecName "kube-api-access-zm4rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.498197 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-config" (OuterVolumeSpecName: "config") pod "5728000b-35c8-4748-a8bd-722a9d4da288" (UID: "5728000b-35c8-4748-a8bd-722a9d4da288"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.503545 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5728000b-35c8-4748-a8bd-722a9d4da288" (UID: "5728000b-35c8-4748-a8bd-722a9d4da288"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.559057 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.559100 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5728000b-35c8-4748-a8bd-722a9d4da288-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.559111 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm4rb\" (UniqueName: \"kubernetes.io/projected/5728000b-35c8-4748-a8bd-722a9d4da288-kube-api-access-zm4rb\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:24 crc kubenswrapper[4922]: I0218 11:53:24.984038 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" path="/var/lib/kubelet/pods/a9adaf2c-91b1-42f9-8d96-307a08030cce/volumes" Feb 18 11:53:25 crc kubenswrapper[4922]: I0218 11:53:25.185547 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" event={"ID":"5728000b-35c8-4748-a8bd-722a9d4da288","Type":"ContainerDied","Data":"05332c9ac22e2e275a6ff8155167585702fc418b67e75c5ce73b68418331a924"} Feb 18 11:53:25 crc kubenswrapper[4922]: I0218 11:53:25.185651 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rkhdz" Feb 18 11:53:25 crc kubenswrapper[4922]: I0218 11:53:25.240087 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rkhdz"] Feb 18 11:53:25 crc kubenswrapper[4922]: I0218 11:53:25.245849 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rkhdz"] Feb 18 11:53:25 crc kubenswrapper[4922]: I0218 11:53:25.433301 4922 scope.go:117] "RemoveContainer" containerID="cb1206c209e030ad065b0424209cc4f379d4283c70032fb8543af8119687b039" Feb 18 11:53:25 crc kubenswrapper[4922]: I0218 11:53:25.824328 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wknpt"] Feb 18 11:53:26 crc kubenswrapper[4922]: I0218 11:53:26.151424 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mt2n6"] Feb 18 11:53:26 crc kubenswrapper[4922]: I0218 11:53:26.289109 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nhn5n"] Feb 18 11:53:26 crc kubenswrapper[4922]: W0218 11:53:26.507482 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ee71b5_db58_4478_94c7_0067be9c018e.slice/crio-ff48b67de3bebc231cad1e6022f943d477fb1723890c418485b7af3b8fa2de11 WatchSource:0}: Error finding container ff48b67de3bebc231cad1e6022f943d477fb1723890c418485b7af3b8fa2de11: Status 404 returned error can't find the container with id ff48b67de3bebc231cad1e6022f943d477fb1723890c418485b7af3b8fa2de11 Feb 18 11:53:26 crc kubenswrapper[4922]: W0218 11:53:26.513046 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30419d6d_3999_43ef_8cd9_07143299061a.slice/crio-d25632e30548614105c4a5a234e63a61b760dc62618152fb02637520873fb6e0 WatchSource:0}: Error finding container 
d25632e30548614105c4a5a234e63a61b760dc62618152fb02637520873fb6e0: Status 404 returned error can't find the container with id d25632e30548614105c4a5a234e63a61b760dc62618152fb02637520873fb6e0 Feb 18 11:53:26 crc kubenswrapper[4922]: I0218 11:53:26.520186 4922 scope.go:117] "RemoveContainer" containerID="7d7de311b38252b803029fed4ce2a344d9898ebcba31e61063d58705f1576aca" Feb 18 11:53:26 crc kubenswrapper[4922]: I0218 11:53:26.987291 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" path="/var/lib/kubelet/pods/5728000b-35c8-4748-a8bd-722a9d4da288/volumes" Feb 18 11:53:27 crc kubenswrapper[4922]: I0218 11:53:27.202645 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wknpt" event={"ID":"c831c6ce-ca0c-4f7d-8268-b4efe13e687d","Type":"ContainerStarted","Data":"062f7f43eb7d96421bec1a95ff80452743a6e5e6148aeae796eecb137024ca1f"} Feb 18 11:53:27 crc kubenswrapper[4922]: I0218 11:53:27.203953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mt2n6" event={"ID":"66ee71b5-db58-4478-94c7-0067be9c018e","Type":"ContainerStarted","Data":"ff48b67de3bebc231cad1e6022f943d477fb1723890c418485b7af3b8fa2de11"} Feb 18 11:53:27 crc kubenswrapper[4922]: I0218 11:53:27.205169 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" event={"ID":"30419d6d-3999-43ef-8cd9-07143299061a","Type":"ContainerStarted","Data":"d25632e30548614105c4a5a234e63a61b760dc62618152fb02637520873fb6e0"} Feb 18 11:53:27 crc kubenswrapper[4922]: I0218 11:53:27.685885 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-gcsm9" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.103:5353: i/o timeout" Feb 18 11:53:38 crc kubenswrapper[4922]: I0218 11:53:38.618445 4922 scope.go:117] "RemoveContainer" 
containerID="e926ccb5b4c9a13d1cee243527546b45e51365b24bcd44a54ffecfd0423a70d5" Feb 18 11:53:39 crc kubenswrapper[4922]: E0218 11:53:39.157518 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 18 11:53:39 crc kubenswrapper[4922]: E0218 11:53:39.157831 4922 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 18 11:53:39 crc kubenswrapper[4922]: E0218 11:53:39.157956 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hhpfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(2aa305a0-c015-43c2-851c-8eff778238be): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:53:39 crc kubenswrapper[4922]: E0218 11:53:39.159114 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="2aa305a0-c015-43c2-851c-8eff778238be" Feb 18 11:53:39 crc kubenswrapper[4922]: I0218 11:53:39.304327 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"873b23d0-3c83-4ab7-8178-1c4832c544a0","Type":"ContainerStarted","Data":"4aa86186b70bbaa1d88853213d84ce4bbe639bf86efad906f20d670b7d784b7c"} Feb 18 11:53:39 crc kubenswrapper[4922]: E0218 11:53:39.307801 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="2aa305a0-c015-43c2-851c-8eff778238be" Feb 18 11:53:39 crc kubenswrapper[4922]: I0218 11:53:39.807923 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:53:39 crc kubenswrapper[4922]: I0218 11:53:39.807986 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:53:40 crc kubenswrapper[4922]: I0218 11:53:40.314084 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cef557d2-b935-4cf6-98f1-d3c2251c0e38","Type":"ContainerStarted","Data":"fcebb6698c6e4874e1a9ec8fbf1e0b1f0b32b5ba69a663e4d66216e0e480bd70"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.345832 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wknpt" event={"ID":"c831c6ce-ca0c-4f7d-8268-b4efe13e687d","Type":"ContainerStarted","Data":"4b2b9c1b4820b77aa29de7ecfd5734aa99c8ab5af9603f71c0064adb4ecf2c6c"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.349806 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"186f064b-a9e8-4637-a5eb-1646f2e1a783","Type":"ContainerStarted","Data":"b66989a7d2943a9028c5fc56e00cc93539607f92c8a52e2b9af3989791e7064f"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.351942 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e","Type":"ContainerStarted","Data":"048e6e6205c00557e2deae0c2b161998505a6264e0d0f3a1a72bad74824fd36b"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.353866 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0ce20f52-4b9d-47a6-8da7-c64cd1d15623","Type":"ContainerStarted","Data":"a97510ff59feaa5e50c719a472be949f9a003a8a6e7db8f428877d2b7e6f59fe"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.354387 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.355607 4922 generic.go:334] "Generic (PLEG): container finished" podID="66ee71b5-db58-4478-94c7-0067be9c018e" containerID="4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969" exitCode=0 Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.355658 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mt2n6" event={"ID":"66ee71b5-db58-4478-94c7-0067be9c018e","Type":"ContainerDied","Data":"4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.359517 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-996pg" event={"ID":"a2d0a226-07e2-402d-a868-2f8374670dac","Type":"ContainerStarted","Data":"9ca45559eb9e2e8e5645e715736a5505eee43b0dc10bcda0685ee036514107c4"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.359634 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-996pg" Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.361235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-stvc7" event={"ID":"cf286fe0-1b17-475a-b71b-ac4897c2f59d","Type":"ContainerStarted","Data":"1aa1946e5f937a792f7f4beb6cc426625f028314b133073cad0066076976cbfc"} 
Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.364925 4922 generic.go:334] "Generic (PLEG): container finished" podID="30419d6d-3999-43ef-8cd9-07143299061a" containerID="60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf" exitCode=0 Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.365136 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" event={"ID":"30419d6d-3999-43ef-8cd9-07143299061a","Type":"ContainerDied","Data":"60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.366481 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wknpt" podStartSLOduration=8.678920049 podStartE2EDuration="22.3664615s" podCreationTimestamp="2026-02-18 11:53:19 +0000 UTC" firstStartedPulling="2026-02-18 11:53:26.50318256 +0000 UTC m=+1008.230886660" lastFinishedPulling="2026-02-18 11:53:40.190724031 +0000 UTC m=+1021.918428111" observedRunningTime="2026-02-18 11:53:41.361627897 +0000 UTC m=+1023.089331977" watchObservedRunningTime="2026-02-18 11:53:41.3664615 +0000 UTC m=+1023.094165580" Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.376341 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12b84523-522e-4e8c-b78e-0094262fb1f8","Type":"ContainerStarted","Data":"e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.380036 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd","Type":"ContainerStarted","Data":"cce6c10d0b2be9b757f773def29c1072ae8928bec75e04b265c07a6436f35911"} Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.459485 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=28.731091133 
podStartE2EDuration="40.459462373s" podCreationTimestamp="2026-02-18 11:53:01 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.426737454 +0000 UTC m=+995.154441534" lastFinishedPulling="2026-02-18 11:53:25.155108694 +0000 UTC m=+1006.882812774" observedRunningTime="2026-02-18 11:53:41.428759053 +0000 UTC m=+1023.156463163" watchObservedRunningTime="2026-02-18 11:53:41.459462373 +0000 UTC m=+1023.187166453" Feb 18 11:53:41 crc kubenswrapper[4922]: I0218 11:53:41.461273 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-996pg" podStartSLOduration=23.812160344 podStartE2EDuration="35.461267069s" podCreationTimestamp="2026-02-18 11:53:06 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.784196865 +0000 UTC m=+995.511900945" lastFinishedPulling="2026-02-18 11:53:25.43330359 +0000 UTC m=+1007.161007670" observedRunningTime="2026-02-18 11:53:41.449987982 +0000 UTC m=+1023.177692062" watchObservedRunningTime="2026-02-18 11:53:41.461267069 +0000 UTC m=+1023.188971149" Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.390448 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b6f3b4f2-3f65-4278-9cd0-753adfee2ecd","Type":"ContainerStarted","Data":"c199f7543b182f63fa0fbb340ec1e9ffc666d55129b7e5d914721d449dd1bcab"} Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.392678 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mt2n6" event={"ID":"66ee71b5-db58-4478-94c7-0067be9c018e","Type":"ContainerStarted","Data":"4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94"} Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.392836 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.394894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"186f064b-a9e8-4637-a5eb-1646f2e1a783","Type":"ContainerStarted","Data":"6e8120987d4ecbdfcb1b57bf6cf20f094da9b59df908aa1d02aed03319948576"} Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.396321 4922 generic.go:334] "Generic (PLEG): container finished" podID="cf286fe0-1b17-475a-b71b-ac4897c2f59d" containerID="1aa1946e5f937a792f7f4beb6cc426625f028314b133073cad0066076976cbfc" exitCode=0 Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.396380 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-stvc7" event={"ID":"cf286fe0-1b17-475a-b71b-ac4897c2f59d","Type":"ContainerDied","Data":"1aa1946e5f937a792f7f4beb6cc426625f028314b133073cad0066076976cbfc"} Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.399291 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" event={"ID":"30419d6d-3999-43ef-8cd9-07143299061a","Type":"ContainerStarted","Data":"e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a"} Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.419901 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.612548705000002 podStartE2EDuration="36.419876212s" podCreationTimestamp="2026-02-18 11:53:06 +0000 UTC" firstStartedPulling="2026-02-18 11:53:14.700089741 +0000 UTC m=+996.427793821" lastFinishedPulling="2026-02-18 11:53:26.507417238 +0000 UTC m=+1008.235121328" observedRunningTime="2026-02-18 11:53:42.410809032 +0000 UTC m=+1024.138513132" watchObservedRunningTime="2026-02-18 11:53:42.419876212 +0000 UTC m=+1024.147580292" Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.433130 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-mt2n6" podStartSLOduration=22.433110908 podStartE2EDuration="22.433110908s" podCreationTimestamp="2026-02-18 11:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:42.427087285 +0000 UTC m=+1024.154791365" watchObservedRunningTime="2026-02-18 11:53:42.433110908 +0000 UTC m=+1024.160814988" Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.477446 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.496813 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=21.905350945 podStartE2EDuration="33.496792336s" podCreationTimestamp="2026-02-18 11:53:09 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.842505746 +0000 UTC m=+995.570209826" lastFinishedPulling="2026-02-18 11:53:25.433947137 +0000 UTC m=+1007.161651217" observedRunningTime="2026-02-18 11:53:42.468667862 +0000 UTC m=+1024.196371942" watchObservedRunningTime="2026-02-18 11:53:42.496792336 +0000 UTC m=+1024.224496416" Feb 18 11:53:42 crc kubenswrapper[4922]: I0218 11:53:42.499111 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" podStartSLOduration=22.499088165 podStartE2EDuration="22.499088165s" podCreationTimestamp="2026-02-18 11:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:42.495913274 +0000 UTC m=+1024.223617354" watchObservedRunningTime="2026-02-18 11:53:42.499088165 +0000 UTC m=+1024.226792245" Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.409026 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerStarted","Data":"8a6f423971fc2df44ffe536b7fffc2792fbaae14dda7a484a39a6fbec81c8d3e"} Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.410261 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="873b23d0-3c83-4ab7-8178-1c4832c544a0" containerID="4aa86186b70bbaa1d88853213d84ce4bbe639bf86efad906f20d670b7d784b7c" exitCode=0 Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.410312 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"873b23d0-3c83-4ab7-8178-1c4832c544a0","Type":"ContainerDied","Data":"4aa86186b70bbaa1d88853213d84ce4bbe639bf86efad906f20d670b7d784b7c"} Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.417108 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-stvc7" event={"ID":"cf286fe0-1b17-475a-b71b-ac4897c2f59d","Type":"ContainerStarted","Data":"247a4a35d0562231316ad55424b35b3d988a6065476c802b024259ecd57da27b"} Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.417146 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-stvc7" event={"ID":"cf286fe0-1b17-475a-b71b-ac4897c2f59d","Type":"ContainerStarted","Data":"560f8f6a75194e2a684dcb5eebe01dc8e4f7d90457e9ab9e0f882f2638c4c36b"} Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.417162 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-stvc7" Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.418437 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-stvc7" Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.418482 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.471748 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-stvc7" podStartSLOduration=25.717471905 podStartE2EDuration="37.471725853s" podCreationTimestamp="2026-02-18 11:53:06 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.402413496 +0000 UTC m=+995.130117576" 
lastFinishedPulling="2026-02-18 11:53:25.156667444 +0000 UTC m=+1006.884371524" observedRunningTime="2026-02-18 11:53:43.471692043 +0000 UTC m=+1025.199396143" watchObservedRunningTime="2026-02-18 11:53:43.471725853 +0000 UTC m=+1025.199429943" Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.477824 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 18 11:53:43 crc kubenswrapper[4922]: I0218 11:53:43.868443 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:44 crc kubenswrapper[4922]: I0218 11:53:44.424749 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"873b23d0-3c83-4ab7-8178-1c4832c544a0","Type":"ContainerStarted","Data":"0307943daf75af30bf450dbdaa6f8cf8fd9ee96038755b9083999ea56738fb10"} Feb 18 11:53:45 crc kubenswrapper[4922]: I0218 11:53:45.432201 4922 generic.go:334] "Generic (PLEG): container finished" podID="302e3b56-c5a4-4e80-bb7e-a9e6a61a119e" containerID="048e6e6205c00557e2deae0c2b161998505a6264e0d0f3a1a72bad74824fd36b" exitCode=0 Feb 18 11:53:45 crc kubenswrapper[4922]: I0218 11:53:45.432261 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e","Type":"ContainerDied","Data":"048e6e6205c00557e2deae0c2b161998505a6264e0d0f3a1a72bad74824fd36b"} Feb 18 11:53:45 crc kubenswrapper[4922]: I0218 11:53:45.454143 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.001173341 podStartE2EDuration="46.454124175s" podCreationTimestamp="2026-02-18 11:52:59 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.275753018 +0000 UTC m=+995.003457098" lastFinishedPulling="2026-02-18 11:53:24.728703852 +0000 UTC m=+1006.456407932" observedRunningTime="2026-02-18 11:53:44.45171488 +0000 UTC m=+1026.179418980" 
watchObservedRunningTime="2026-02-18 11:53:45.454124175 +0000 UTC m=+1027.181828255" Feb 18 11:53:45 crc kubenswrapper[4922]: I0218 11:53:45.867955 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.443289 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"302e3b56-c5a4-4e80-bb7e-a9e6a61a119e","Type":"ContainerStarted","Data":"5531d27efb6d88e6ac3081f1f498415eb1081729a663f20139846ae7b13dcf3f"} Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.523505 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.542449 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=35.691148417 podStartE2EDuration="48.542429663s" podCreationTimestamp="2026-02-18 11:52:58 +0000 UTC" firstStartedPulling="2026-02-18 11:53:12.565645938 +0000 UTC m=+994.293350018" lastFinishedPulling="2026-02-18 11:53:25.416927154 +0000 UTC m=+1007.144631264" observedRunningTime="2026-02-18 11:53:46.466875184 +0000 UTC m=+1028.194579264" watchObservedRunningTime="2026-02-18 11:53:46.542429663 +0000 UTC m=+1028.270133743" Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.560151 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.652879 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.903437 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:46 crc kubenswrapper[4922]: I0218 11:53:46.947932 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116033 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 18 11:53:47 crc kubenswrapper[4922]: E0218 11:53:47.116395 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="dnsmasq-dns" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116415 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="dnsmasq-dns" Feb 18 11:53:47 crc kubenswrapper[4922]: E0218 11:53:47.116445 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="init" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116454 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="init" Feb 18 11:53:47 crc kubenswrapper[4922]: E0218 11:53:47.116466 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="dnsmasq-dns" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116472 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="dnsmasq-dns" Feb 18 11:53:47 crc kubenswrapper[4922]: E0218 11:53:47.116485 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="init" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116490 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="init" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116653 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5728000b-35c8-4748-a8bd-722a9d4da288" containerName="dnsmasq-dns" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.116666 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a9adaf2c-91b1-42f9-8d96-307a08030cce" containerName="dnsmasq-dns" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.117574 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.120267 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q9qpq" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.120293 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.120545 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.124356 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.139073 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161195 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161392 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161457 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09bbc755-2862-437b-9ef3-515103f77710-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161524 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09bbc755-2862-437b-9ef3-515103f77710-scripts\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161545 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpph\" (UniqueName: \"kubernetes.io/projected/09bbc755-2862-437b-9ef3-515103f77710-kube-api-access-xqpph\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.161590 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bbc755-2862-437b-9ef3-515103f77710-config\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.263568 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpph\" (UniqueName: 
\"kubernetes.io/projected/09bbc755-2862-437b-9ef3-515103f77710-kube-api-access-xqpph\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.263641 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bbc755-2862-437b-9ef3-515103f77710-config\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.263682 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.263736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.263777 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09bbc755-2862-437b-9ef3-515103f77710-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.263791 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc 
kubenswrapper[4922]: I0218 11:53:47.263820 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09bbc755-2862-437b-9ef3-515103f77710-scripts\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.264426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09bbc755-2862-437b-9ef3-515103f77710-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.264725 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09bbc755-2862-437b-9ef3-515103f77710-scripts\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.264767 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09bbc755-2862-437b-9ef3-515103f77710-config\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.269010 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.269149 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.271064 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/09bbc755-2862-437b-9ef3-515103f77710-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.284717 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpph\" (UniqueName: \"kubernetes.io/projected/09bbc755-2862-437b-9ef3-515103f77710-kube-api-access-xqpph\") pod \"ovn-northd-0\" (UID: \"09bbc755-2862-437b-9ef3-515103f77710\") " pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.435455 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 11:53:47 crc kubenswrapper[4922]: I0218 11:53:47.922741 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 11:53:48 crc kubenswrapper[4922]: I0218 11:53:48.476623 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"09bbc755-2862-437b-9ef3-515103f77710","Type":"ContainerStarted","Data":"d10d4588f87932e6c854e291f60ea902608e9bab1bae55bffccfe799d6dfe77d"} Feb 18 11:53:48 crc kubenswrapper[4922]: I0218 11:53:48.478586 4922 generic.go:334] "Generic (PLEG): container finished" podID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerID="8a6f423971fc2df44ffe536b7fffc2792fbaae14dda7a484a39a6fbec81c8d3e" exitCode=0 Feb 18 11:53:48 crc kubenswrapper[4922]: I0218 11:53:48.478623 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerDied","Data":"8a6f423971fc2df44ffe536b7fffc2792fbaae14dda7a484a39a6fbec81c8d3e"} Feb 18 11:53:49 crc 
kubenswrapper[4922]: I0218 11:53:49.510261 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"09bbc755-2862-437b-9ef3-515103f77710","Type":"ContainerStarted","Data":"9e408146c1e2977c9d8ebe6b86cf912809e85489b9a69912260744ae6bb8bac7"} Feb 18 11:53:49 crc kubenswrapper[4922]: I0218 11:53:49.510703 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"09bbc755-2862-437b-9ef3-515103f77710","Type":"ContainerStarted","Data":"0c3bd53d7f6433881def4720837acaf2e86ea1f115702f6575b37035ffd2e8fd"} Feb 18 11:53:49 crc kubenswrapper[4922]: I0218 11:53:49.510732 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 18 11:53:49 crc kubenswrapper[4922]: I0218 11:53:49.540541 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.483582376 podStartE2EDuration="2.540520857s" podCreationTimestamp="2026-02-18 11:53:47 +0000 UTC" firstStartedPulling="2026-02-18 11:53:47.926924365 +0000 UTC m=+1029.654628445" lastFinishedPulling="2026-02-18 11:53:48.983862856 +0000 UTC m=+1030.711566926" observedRunningTime="2026-02-18 11:53:49.532858833 +0000 UTC m=+1031.260562923" watchObservedRunningTime="2026-02-18 11:53:49.540520857 +0000 UTC m=+1031.268224937" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.021542 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.021778 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.209081 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.610092 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.611406 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.670327 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:53:50 crc kubenswrapper[4922]: I0218 11:53:50.730247 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nhn5n"] Feb 18 11:53:51 crc kubenswrapper[4922]: I0218 11:53:51.260292 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:51 crc kubenswrapper[4922]: I0218 11:53:51.260595 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:51 crc kubenswrapper[4922]: I0218 11:53:51.383479 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:51 crc kubenswrapper[4922]: I0218 11:53:51.524162 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" podUID="30419d6d-3999-43ef-8cd9-07143299061a" containerName="dnsmasq-dns" containerID="cri-o://e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a" gracePeriod=10 Feb 18 11:53:51 crc kubenswrapper[4922]: I0218 11:53:51.612838 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 18 11:53:51 crc kubenswrapper[4922]: I0218 11:53:51.989019 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.064321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-ovsdbserver-nb\") pod \"30419d6d-3999-43ef-8cd9-07143299061a\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.064440 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-config\") pod \"30419d6d-3999-43ef-8cd9-07143299061a\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.064525 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d74b5\" (UniqueName: \"kubernetes.io/projected/30419d6d-3999-43ef-8cd9-07143299061a-kube-api-access-d74b5\") pod \"30419d6d-3999-43ef-8cd9-07143299061a\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.064591 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-dns-svc\") pod \"30419d6d-3999-43ef-8cd9-07143299061a\" (UID: \"30419d6d-3999-43ef-8cd9-07143299061a\") " Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.070321 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30419d6d-3999-43ef-8cd9-07143299061a-kube-api-access-d74b5" (OuterVolumeSpecName: "kube-api-access-d74b5") pod "30419d6d-3999-43ef-8cd9-07143299061a" (UID: "30419d6d-3999-43ef-8cd9-07143299061a"). InnerVolumeSpecName "kube-api-access-d74b5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.107722 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30419d6d-3999-43ef-8cd9-07143299061a" (UID: "30419d6d-3999-43ef-8cd9-07143299061a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.110024 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30419d6d-3999-43ef-8cd9-07143299061a" (UID: "30419d6d-3999-43ef-8cd9-07143299061a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.118640 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-config" (OuterVolumeSpecName: "config") pod "30419d6d-3999-43ef-8cd9-07143299061a" (UID: "30419d6d-3999-43ef-8cd9-07143299061a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.167399 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.167437 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d74b5\" (UniqueName: \"kubernetes.io/projected/30419d6d-3999-43ef-8cd9-07143299061a-kube-api-access-d74b5\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.167452 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.167466 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30419d6d-3999-43ef-8cd9-07143299061a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.533134 4922 generic.go:334] "Generic (PLEG): container finished" podID="30419d6d-3999-43ef-8cd9-07143299061a" containerID="e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a" exitCode=0 Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.533210 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" event={"ID":"30419d6d-3999-43ef-8cd9-07143299061a","Type":"ContainerDied","Data":"e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a"} Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.533233 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.533258 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nhn5n" event={"ID":"30419d6d-3999-43ef-8cd9-07143299061a","Type":"ContainerDied","Data":"d25632e30548614105c4a5a234e63a61b760dc62618152fb02637520873fb6e0"} Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.533281 4922 scope.go:117] "RemoveContainer" containerID="e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.565449 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nhn5n"] Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.572042 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nhn5n"] Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.763472 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dd51-account-create-update-fr8ml"] Feb 18 11:53:52 crc kubenswrapper[4922]: E0218 11:53:52.764008 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30419d6d-3999-43ef-8cd9-07143299061a" containerName="dnsmasq-dns" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.764020 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30419d6d-3999-43ef-8cd9-07143299061a" containerName="dnsmasq-dns" Feb 18 11:53:52 crc kubenswrapper[4922]: E0218 11:53:52.764057 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30419d6d-3999-43ef-8cd9-07143299061a" containerName="init" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.764063 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30419d6d-3999-43ef-8cd9-07143299061a" containerName="init" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.764197 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30419d6d-3999-43ef-8cd9-07143299061a" 
containerName="dnsmasq-dns" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.768032 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.770650 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.775124 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dd51-account-create-update-fr8ml"] Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.808925 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-499z7"] Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.809841 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-499z7" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.822501 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-499z7"] Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.876844 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmlzq\" (UniqueName: \"kubernetes.io/projected/811ffd65-f5dc-44a3-a1cb-778937ca9771-kube-api-access-vmlzq\") pod \"keystone-dd51-account-create-update-fr8ml\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") " pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.876940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ffd65-f5dc-44a3-a1cb-778937ca9771-operator-scripts\") pod \"keystone-dd51-account-create-update-fr8ml\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") " pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 
11:53:52.979206 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d3c9160-dd6d-4591-9554-d3c74df3a64e-operator-scripts\") pod \"keystone-db-create-499z7\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") " pod="openstack/keystone-db-create-499z7" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.979261 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmlzq\" (UniqueName: \"kubernetes.io/projected/811ffd65-f5dc-44a3-a1cb-778937ca9771-kube-api-access-vmlzq\") pod \"keystone-dd51-account-create-update-fr8ml\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") " pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.979455 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ffd65-f5dc-44a3-a1cb-778937ca9771-operator-scripts\") pod \"keystone-dd51-account-create-update-fr8ml\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") " pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.979723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ps9\" (UniqueName: \"kubernetes.io/projected/0d3c9160-dd6d-4591-9554-d3c74df3a64e-kube-api-access-m8ps9\") pod \"keystone-db-create-499z7\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") " pod="openstack/keystone-db-create-499z7" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.980435 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ffd65-f5dc-44a3-a1cb-778937ca9771-operator-scripts\") pod \"keystone-dd51-account-create-update-fr8ml\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") " 
pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:52 crc kubenswrapper[4922]: I0218 11:53:52.985061 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30419d6d-3999-43ef-8cd9-07143299061a" path="/var/lib/kubelet/pods/30419d6d-3999-43ef-8cd9-07143299061a/volumes" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.013167 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2zkj4"] Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.014950 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.022057 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2zkj4"] Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.022482 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmlzq\" (UniqueName: \"kubernetes.io/projected/811ffd65-f5dc-44a3-a1cb-778937ca9771-kube-api-access-vmlzq\") pod \"keystone-dd51-account-create-update-fr8ml\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") " pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.081918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ps9\" (UniqueName: \"kubernetes.io/projected/0d3c9160-dd6d-4591-9554-d3c74df3a64e-kube-api-access-m8ps9\") pod \"keystone-db-create-499z7\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") " pod="openstack/keystone-db-create-499z7" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.082024 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d3c9160-dd6d-4591-9554-d3c74df3a64e-operator-scripts\") pod \"keystone-db-create-499z7\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") " 
pod="openstack/keystone-db-create-499z7" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.082909 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d3c9160-dd6d-4591-9554-d3c74df3a64e-operator-scripts\") pod \"keystone-db-create-499z7\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") " pod="openstack/keystone-db-create-499z7" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.088748 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd51-account-create-update-fr8ml" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.109253 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ps9\" (UniqueName: \"kubernetes.io/projected/0d3c9160-dd6d-4591-9554-d3c74df3a64e-kube-api-access-m8ps9\") pod \"keystone-db-create-499z7\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") " pod="openstack/keystone-db-create-499z7" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.128165 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-499z7" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.131615 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a3b1-account-create-update-5qfd8"] Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.133029 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.135282 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.144179 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a3b1-account-create-update-5qfd8"] Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.187911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05f03ea4-2462-4f2c-b9b8-395fc9802993-operator-scripts\") pod \"placement-db-create-2zkj4\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") " pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.188019 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvv6\" (UniqueName: \"kubernetes.io/projected/05f03ea4-2462-4f2c-b9b8-395fc9802993-kube-api-access-6jvv6\") pod \"placement-db-create-2zkj4\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") " pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.289117 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85eec6a5-292b-4061-bb90-18904535d9cc-operator-scripts\") pod \"placement-a3b1-account-create-update-5qfd8\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") " pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.289187 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05f03ea4-2462-4f2c-b9b8-395fc9802993-operator-scripts\") pod \"placement-db-create-2zkj4\" (UID: 
\"05f03ea4-2462-4f2c-b9b8-395fc9802993\") " pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.289267 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvv6\" (UniqueName: \"kubernetes.io/projected/05f03ea4-2462-4f2c-b9b8-395fc9802993-kube-api-access-6jvv6\") pod \"placement-db-create-2zkj4\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") " pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.289309 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqlnd\" (UniqueName: \"kubernetes.io/projected/85eec6a5-292b-4061-bb90-18904535d9cc-kube-api-access-mqlnd\") pod \"placement-a3b1-account-create-update-5qfd8\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") " pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.290062 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05f03ea4-2462-4f2c-b9b8-395fc9802993-operator-scripts\") pod \"placement-db-create-2zkj4\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") " pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.303825 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvv6\" (UniqueName: \"kubernetes.io/projected/05f03ea4-2462-4f2c-b9b8-395fc9802993-kube-api-access-6jvv6\") pod \"placement-db-create-2zkj4\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") " pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.361635 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2zkj4" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.390667 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85eec6a5-292b-4061-bb90-18904535d9cc-operator-scripts\") pod \"placement-a3b1-account-create-update-5qfd8\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") " pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.390856 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqlnd\" (UniqueName: \"kubernetes.io/projected/85eec6a5-292b-4061-bb90-18904535d9cc-kube-api-access-mqlnd\") pod \"placement-a3b1-account-create-update-5qfd8\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") " pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.391569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85eec6a5-292b-4061-bb90-18904535d9cc-operator-scripts\") pod \"placement-a3b1-account-create-update-5qfd8\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") " pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.407828 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqlnd\" (UniqueName: \"kubernetes.io/projected/85eec6a5-292b-4061-bb90-18904535d9cc-kube-api-access-mqlnd\") pod \"placement-a3b1-account-create-update-5qfd8\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") " pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:53 crc kubenswrapper[4922]: I0218 11:53:53.475938 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a3b1-account-create-update-5qfd8" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.165961 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-xj7zt"] Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.167219 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.176726 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-xj7zt"] Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.283921 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-95b8-account-create-update-58r6z"] Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.285046 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.289965 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.299871 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-95b8-account-create-update-58r6z"] Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.307576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e854dba-d50f-4228-9b7a-c8a0ae16347a-operator-scripts\") pod \"watcher-db-create-xj7zt\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") " pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.307658 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fft6\" (UniqueName: \"kubernetes.io/projected/3e854dba-d50f-4228-9b7a-c8a0ae16347a-kube-api-access-6fft6\") pod 
\"watcher-db-create-xj7zt\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") " pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.408579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-operator-scripts\") pod \"watcher-95b8-account-create-update-58r6z\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") " pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.408630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6t58\" (UniqueName: \"kubernetes.io/projected/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-kube-api-access-d6t58\") pod \"watcher-95b8-account-create-update-58r6z\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") " pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.408676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e854dba-d50f-4228-9b7a-c8a0ae16347a-operator-scripts\") pod \"watcher-db-create-xj7zt\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") " pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.408909 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fft6\" (UniqueName: \"kubernetes.io/projected/3e854dba-d50f-4228-9b7a-c8a0ae16347a-kube-api-access-6fft6\") pod \"watcher-db-create-xj7zt\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") " pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.409421 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3e854dba-d50f-4228-9b7a-c8a0ae16347a-operator-scripts\") pod \"watcher-db-create-xj7zt\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") " pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.433165 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zmcwq"] Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.449325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fft6\" (UniqueName: \"kubernetes.io/projected/3e854dba-d50f-4228-9b7a-c8a0ae16347a-kube-api-access-6fft6\") pod \"watcher-db-create-xj7zt\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") " pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.461865 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zmcwq"] Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.462138 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.491078 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-xj7zt" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.511391 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-operator-scripts\") pod \"watcher-95b8-account-create-update-58r6z\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") " pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.511477 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6t58\" (UniqueName: \"kubernetes.io/projected/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-kube-api-access-d6t58\") pod \"watcher-95b8-account-create-update-58r6z\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") " pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.512594 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-operator-scripts\") pod \"watcher-95b8-account-create-update-58r6z\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") " pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.542450 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6t58\" (UniqueName: \"kubernetes.io/projected/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-kube-api-access-d6t58\") pod \"watcher-95b8-account-create-update-58r6z\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") " pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.608113 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-95b8-account-create-update-58r6z" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.612805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-config\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.612865 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.612907 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.612937 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7r9n\" (UniqueName: \"kubernetes.io/projected/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-kube-api-access-f7r9n\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.613038 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.714431 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.714494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-config\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.714530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.714565 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.714591 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7r9n\" (UniqueName: \"kubernetes.io/projected/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-kube-api-access-f7r9n\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") 
" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.715470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.715580 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.715632 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-config\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.715773 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.744123 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7r9n\" (UniqueName: \"kubernetes.io/projected/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-kube-api-access-f7r9n\") pod \"dnsmasq-dns-b8fbc5445-zmcwq\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:54 crc kubenswrapper[4922]: I0218 11:53:54.812180 4922 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.608183 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.621955 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.623039 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.625564 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.625745 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.625910 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.626043 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gh6mm" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.731013 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0771bdc1-7622-4a65-aa82-3150630ce652-lock\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.731097 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0771bdc1-7622-4a65-aa82-3150630ce652-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" 
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.731140 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn972\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-kube-api-access-jn972\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.731177 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.731260 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.731279 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0771bdc1-7622-4a65-aa82-3150630ce652-cache\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832376 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832436 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0771bdc1-7622-4a65-aa82-3150630ce652-cache\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832488 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0771bdc1-7622-4a65-aa82-3150630ce652-lock\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832543 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0771bdc1-7622-4a65-aa82-3150630ce652-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn972\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-kube-api-access-jn972\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832636 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832840 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" 
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.832942 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0771bdc1-7622-4a65-aa82-3150630ce652-cache\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:53:55 crc kubenswrapper[4922]: E0218 11:53:55.832851 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 18 11:53:55 crc kubenswrapper[4922]: E0218 11:53:55.833006 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 18 11:53:55 crc kubenswrapper[4922]: E0218 11:53:55.833053 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift podName:0771bdc1-7622-4a65-aa82-3150630ce652 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.333036173 +0000 UTC m=+1038.060740253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift") pod "swift-storage-0" (UID: "0771bdc1-7622-4a65-aa82-3150630ce652") : configmap "swift-ring-files" not found
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.833057 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0771bdc1-7622-4a65-aa82-3150630ce652-lock\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.848283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0771bdc1-7622-4a65-aa82-3150630ce652-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.851255 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn972\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-kube-api-access-jn972\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.868247 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.937080 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6gzbs"]
Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.938541 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.941460 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.941624 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.941736 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 18 11:53:55 crc kubenswrapper[4922]: I0218 11:53:55.969844 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6gzbs"] Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.036207 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-dispersionconf\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.036265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s79m\" (UniqueName: \"kubernetes.io/projected/83fbf909-70fe-4d3c-9b45-3f5a6733779c-kube-api-access-6s79m\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.036292 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83fbf909-70fe-4d3c-9b45-3f5a6733779c-etc-swift\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 
11:53:56.036319 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-ring-data-devices\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.036354 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-swiftconf\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.036488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-scripts\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.036513 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-combined-ca-bundle\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138315 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-swiftconf\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138479 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-scripts\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138500 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-combined-ca-bundle\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138549 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-dispersionconf\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s79m\" (UniqueName: \"kubernetes.io/projected/83fbf909-70fe-4d3c-9b45-3f5a6733779c-kube-api-access-6s79m\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138610 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83fbf909-70fe-4d3c-9b45-3f5a6733779c-etc-swift\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.138640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-ring-data-devices\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.139262 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83fbf909-70fe-4d3c-9b45-3f5a6733779c-etc-swift\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.139418 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-ring-data-devices\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.139448 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-scripts\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.142691 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-swiftconf\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.142926 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-dispersionconf\") pod 
\"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.150916 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-combined-ca-bundle\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.161936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s79m\" (UniqueName: \"kubernetes.io/projected/83fbf909-70fe-4d3c-9b45-3f5a6733779c-kube-api-access-6s79m\") pod \"swift-ring-rebalance-6gzbs\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.269488 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.342692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:56 crc kubenswrapper[4922]: E0218 11:53:56.342889 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 11:53:56 crc kubenswrapper[4922]: E0218 11:53:56.342916 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 11:53:56 crc kubenswrapper[4922]: E0218 11:53:56.342975 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift podName:0771bdc1-7622-4a65-aa82-3150630ce652 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.342958258 +0000 UTC m=+1039.070662338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift") pod "swift-storage-0" (UID: "0771bdc1-7622-4a65-aa82-3150630ce652") : configmap "swift-ring-files" not found Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.755836 4922 scope.go:117] "RemoveContainer" containerID="60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.795174 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mqx2n"] Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.796293 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.805931 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mqx2n"] Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.851119 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-operator-scripts\") pod \"glance-db-create-mqx2n\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") " pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.851510 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zdx\" (UniqueName: \"kubernetes.io/projected/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-kube-api-access-44zdx\") pod \"glance-db-create-mqx2n\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") " pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.903164 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2714-account-create-update-j5l9f"] Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.906859 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.909897 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.917377 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2714-account-create-update-j5l9f"] Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.953064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zdx\" (UniqueName: \"kubernetes.io/projected/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-kube-api-access-44zdx\") pod \"glance-db-create-mqx2n\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") " pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.953130 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-operator-scripts\") pod \"glance-db-create-mqx2n\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") " pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.953192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl925\" (UniqueName: \"kubernetes.io/projected/3b15fbe3-8f30-41e8-8897-037694ccb56b-kube-api-access-vl925\") pod \"glance-2714-account-create-update-j5l9f\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") " pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.953211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b15fbe3-8f30-41e8-8897-037694ccb56b-operator-scripts\") pod \"glance-2714-account-create-update-j5l9f\" (UID: 
\"3b15fbe3-8f30-41e8-8897-037694ccb56b\") " pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.953914 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-operator-scripts\") pod \"glance-db-create-mqx2n\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") " pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.980255 4922 scope.go:117] "RemoveContainer" containerID="e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.987094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zdx\" (UniqueName: \"kubernetes.io/projected/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-kube-api-access-44zdx\") pod \"glance-db-create-mqx2n\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") " pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:56 crc kubenswrapper[4922]: E0218 11:53:56.996279 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a\": container with ID starting with e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a not found: ID does not exist" containerID="e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.996328 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a"} err="failed to get container status \"e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a\": rpc error: code = NotFound desc = could not find container \"e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a\": container with ID starting with 
e636c73dbb4ceb7b2f26422690f3ab2d91f3f8e5fae643efe99e90bb666ac64a not found: ID does not exist" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.996441 4922 scope.go:117] "RemoveContainer" containerID="60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf" Feb 18 11:53:56 crc kubenswrapper[4922]: E0218 11:53:56.997006 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf\": container with ID starting with 60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf not found: ID does not exist" containerID="60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf" Feb 18 11:53:56 crc kubenswrapper[4922]: I0218 11:53:56.997058 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf"} err="failed to get container status \"60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf\": rpc error: code = NotFound desc = could not find container \"60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf\": container with ID starting with 60eb9c5023e2d3d6c8d592fa62a787bc619c79be6125ae7f37f446b7b4620ecf not found: ID does not exist" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.055410 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl925\" (UniqueName: \"kubernetes.io/projected/3b15fbe3-8f30-41e8-8897-037694ccb56b-kube-api-access-vl925\") pod \"glance-2714-account-create-update-j5l9f\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") " pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.055448 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3b15fbe3-8f30-41e8-8897-037694ccb56b-operator-scripts\") pod \"glance-2714-account-create-update-j5l9f\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") " pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.056536 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b15fbe3-8f30-41e8-8897-037694ccb56b-operator-scripts\") pod \"glance-2714-account-create-update-j5l9f\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") " pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.082912 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl925\" (UniqueName: \"kubernetes.io/projected/3b15fbe3-8f30-41e8-8897-037694ccb56b-kube-api-access-vl925\") pod \"glance-2714-account-create-update-j5l9f\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") " pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.084330 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2714-account-create-update-j5l9f" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.117942 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mqx2n" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.372096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:57 crc kubenswrapper[4922]: E0218 11:53:57.372559 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 11:53:57 crc kubenswrapper[4922]: E0218 11:53:57.372592 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 11:53:57 crc kubenswrapper[4922]: E0218 11:53:57.372648 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift podName:0771bdc1-7622-4a65-aa82-3150630ce652 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.372628826 +0000 UTC m=+1041.100332906 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift") pod "swift-storage-0" (UID: "0771bdc1-7622-4a65-aa82-3150630ce652") : configmap "swift-ring-files" not found Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.554804 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zmcwq"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.591639 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerStarted","Data":"e96b484d9840abfb2c7757906d574c4d66e64e51bbf1dbb2212180e14faff247"} Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.596249 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aa305a0-c015-43c2-851c-8eff778238be","Type":"ContainerStarted","Data":"c04aa8b95a26312fcc998c31a3b946f55c4ecd9e671bf072302577040fa7d761"} Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.596614 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.597880 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" event={"ID":"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0","Type":"ContainerStarted","Data":"dd9d9badec862e12fd53c23629f63f90af2fbdcdb63f9f7d603ffed75ff6a6ad"} Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.622168 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.109344084 podStartE2EDuration="54.622150214s" podCreationTimestamp="2026-02-18 11:53:03 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.773223476 +0000 UTC m=+995.500927556" lastFinishedPulling="2026-02-18 11:53:57.286029606 +0000 UTC m=+1039.013733686" 
observedRunningTime="2026-02-18 11:53:57.616169802 +0000 UTC m=+1039.343873882" watchObservedRunningTime="2026-02-18 11:53:57.622150214 +0000 UTC m=+1039.349854294" Feb 18 11:53:57 crc kubenswrapper[4922]: W0218 11:53:57.680258 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83fbf909_70fe_4d3c_9b45_3f5a6733779c.slice/crio-7cb68b07b7628d48bbfa07f469ab2d1994a576e88de08a8b50b0bca8c09412b9 WatchSource:0}: Error finding container 7cb68b07b7628d48bbfa07f469ab2d1994a576e88de08a8b50b0bca8c09412b9: Status 404 returned error can't find the container with id 7cb68b07b7628d48bbfa07f469ab2d1994a576e88de08a8b50b0bca8c09412b9 Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.688167 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-xj7zt"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.694553 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6gzbs"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.702439 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-499z7"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.854201 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-95b8-account-create-update-58r6z"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.865755 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2714-account-create-update-j5l9f"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.874452 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dd51-account-create-update-fr8ml"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.882784 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2zkj4"] Feb 18 11:53:57 crc kubenswrapper[4922]: I0218 11:53:57.906869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-a3b1-account-create-update-5qfd8"] Feb 18 11:53:57 crc kubenswrapper[4922]: W0218 11:53:57.965832 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85eec6a5_292b_4061_bb90_18904535d9cc.slice/crio-099bd4d1700ca1fdea66438e1c36747fd163667b51fd5545201fba20a6a21ddf WatchSource:0}: Error finding container 099bd4d1700ca1fdea66438e1c36747fd163667b51fd5545201fba20a6a21ddf: Status 404 returned error can't find the container with id 099bd4d1700ca1fdea66438e1c36747fd163667b51fd5545201fba20a6a21ddf Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.039296 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mqx2n"] Feb 18 11:53:58 crc kubenswrapper[4922]: W0218 11:53:58.066914 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac3a541_a2f7_4d95_97ff_1361fbd3e81e.slice/crio-35bfb15c61c9106211ef6e4dae348b65fb77a6409596090df5a648a4b7682f47 WatchSource:0}: Error finding container 35bfb15c61c9106211ef6e4dae348b65fb77a6409596090df5a648a4b7682f47: Status 404 returned error can't find the container with id 35bfb15c61c9106211ef6e4dae348b65fb77a6409596090df5a648a4b7682f47 Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.607891 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-95b8-account-create-update-58r6z" event={"ID":"cc452273-8a5f-47d8-8aa5-1ddfe2240e28","Type":"ContainerStarted","Data":"54e9125c24a959588989fe6a7b334775970a6c4b231353eee75179d5fb3c2947"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.610832 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a3b1-account-create-update-5qfd8" event={"ID":"85eec6a5-292b-4061-bb90-18904535d9cc","Type":"ContainerStarted","Data":"099bd4d1700ca1fdea66438e1c36747fd163667b51fd5545201fba20a6a21ddf"} Feb 18 11:53:58 crc kubenswrapper[4922]: 
I0218 11:53:58.612438 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6gzbs" event={"ID":"83fbf909-70fe-4d3c-9b45-3f5a6733779c","Type":"ContainerStarted","Data":"7cb68b07b7628d48bbfa07f469ab2d1994a576e88de08a8b50b0bca8c09412b9"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.616172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-xj7zt" event={"ID":"3e854dba-d50f-4228-9b7a-c8a0ae16347a","Type":"ContainerStarted","Data":"d21236550250b2958c4054030789e8473894e0d1d7c3c12b25573ed942732eed"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.616235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-xj7zt" event={"ID":"3e854dba-d50f-4228-9b7a-c8a0ae16347a","Type":"ContainerStarted","Data":"c68fcfcd006d47497315707b33d56f34856ae845dcd2357d6a67318d6da6c7f6"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.621107 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mqx2n" event={"ID":"cac3a541-a2f7-4d95-97ff-1361fbd3e81e","Type":"ContainerStarted","Data":"35bfb15c61c9106211ef6e4dae348b65fb77a6409596090df5a648a4b7682f47"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.629090 4922 generic.go:334] "Generic (PLEG): container finished" podID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerID="95d55b3314d502d1433cbc46f43ce2797b326e2911e0feadc7c77b360ebeb491" exitCode=0 Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.629271 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" event={"ID":"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0","Type":"ContainerDied","Data":"95d55b3314d502d1433cbc46f43ce2797b326e2911e0feadc7c77b360ebeb491"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.634637 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-xj7zt" podStartSLOduration=4.634612195 
podStartE2EDuration="4.634612195s" podCreationTimestamp="2026-02-18 11:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:58.63324629 +0000 UTC m=+1040.360950370" watchObservedRunningTime="2026-02-18 11:53:58.634612195 +0000 UTC m=+1040.362316275" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.639409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2zkj4" event={"ID":"05f03ea4-2462-4f2c-b9b8-395fc9802993","Type":"ContainerStarted","Data":"2a299295394b49d8734504ca223eb46c67dbed1ff8a13151afb9f72e374ec15e"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.647503 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd51-account-create-update-fr8ml" event={"ID":"811ffd65-f5dc-44a3-a1cb-778937ca9771","Type":"ContainerStarted","Data":"80ee7e9b0cec62a50bee70889b18acf5fa94b2334f321fee43b9ca55b8bd52cc"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.674168 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2714-account-create-update-j5l9f" event={"ID":"3b15fbe3-8f30-41e8-8897-037694ccb56b","Type":"ContainerStarted","Data":"b02d51eb3381fc4834bb097a9321f3a4baf43667d320279cf5100f69846caf84"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.678410 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-499z7" event={"ID":"0d3c9160-dd6d-4591-9554-d3c74df3a64e","Type":"ContainerStarted","Data":"30c5754ee779c0d183f858e5000b674deb35f4da6cc4bd94f8c25db1f475b7bc"} Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.722017 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hz64r"] Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.723582 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.726296 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.729117 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hz64r"] Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.802257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82871f18-4432-42c4-bbfd-61ff507a1e95-operator-scripts\") pod \"root-account-create-update-hz64r\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.802462 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpvzn\" (UniqueName: \"kubernetes.io/projected/82871f18-4432-42c4-bbfd-61ff507a1e95-kube-api-access-bpvzn\") pod \"root-account-create-update-hz64r\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.904391 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpvzn\" (UniqueName: \"kubernetes.io/projected/82871f18-4432-42c4-bbfd-61ff507a1e95-kube-api-access-bpvzn\") pod \"root-account-create-update-hz64r\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.904474 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82871f18-4432-42c4-bbfd-61ff507a1e95-operator-scripts\") pod \"root-account-create-update-hz64r\" (UID: 
\"82871f18-4432-42c4-bbfd-61ff507a1e95\") " pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.905180 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82871f18-4432-42c4-bbfd-61ff507a1e95-operator-scripts\") pod \"root-account-create-update-hz64r\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:58 crc kubenswrapper[4922]: I0218 11:53:58.925448 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpvzn\" (UniqueName: \"kubernetes.io/projected/82871f18-4432-42c4-bbfd-61ff507a1e95-kube-api-access-bpvzn\") pod \"root-account-create-update-hz64r\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.097674 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hz64r" Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.438409 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:53:59 crc kubenswrapper[4922]: E0218 11:53:59.438588 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 11:53:59 crc kubenswrapper[4922]: E0218 11:53:59.438928 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 11:53:59 crc kubenswrapper[4922]: E0218 11:53:59.438995 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift podName:0771bdc1-7622-4a65-aa82-3150630ce652 nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.438972679 +0000 UTC m=+1045.166676759 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift") pod "swift-storage-0" (UID: "0771bdc1-7622-4a65-aa82-3150630ce652") : configmap "swift-ring-files" not found Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.688243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2714-account-create-update-j5l9f" event={"ID":"3b15fbe3-8f30-41e8-8897-037694ccb56b","Type":"ContainerStarted","Data":"fedb74d1b6b4c2f0bd2aef2df515af61a8cc0caff9248cf99b996d4ac610fc62"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.691688 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" event={"ID":"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0","Type":"ContainerStarted","Data":"9c3a3f840bf392e642ec5f6d88074e14f3634fce3f5fe950f1ca6ef37f5d1bd5"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.691861 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.693668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-95b8-account-create-update-58r6z" event={"ID":"cc452273-8a5f-47d8-8aa5-1ddfe2240e28","Type":"ContainerStarted","Data":"a445d9cdaa7af5fce54eb1734c1edd288e3e96655dfca8e174ddcae4e353e89d"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.695850 4922 generic.go:334] "Generic (PLEG): container finished" podID="3e854dba-d50f-4228-9b7a-c8a0ae16347a" containerID="d21236550250b2958c4054030789e8473894e0d1d7c3c12b25573ed942732eed" exitCode=0 Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.695953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-xj7zt" event={"ID":"3e854dba-d50f-4228-9b7a-c8a0ae16347a","Type":"ContainerDied","Data":"d21236550250b2958c4054030789e8473894e0d1d7c3c12b25573ed942732eed"} Feb 18 11:53:59 crc kubenswrapper[4922]: 
I0218 11:53:59.698097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd51-account-create-update-fr8ml" event={"ID":"811ffd65-f5dc-44a3-a1cb-778937ca9771","Type":"ContainerStarted","Data":"e6310d509175d11956cf35f2881cac092138c8da93939b01d64f0006c350cdcc"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.703425 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mqx2n" event={"ID":"cac3a541-a2f7-4d95-97ff-1361fbd3e81e","Type":"ContainerStarted","Data":"e7a3b382cba61f7101e30bbe239f4c456f983c4335448f6fab20d5e184472620"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.705802 4922 generic.go:334] "Generic (PLEG): container finished" podID="0d3c9160-dd6d-4591-9554-d3c74df3a64e" containerID="38f62ebe43eed17090600fd985ab87c725adb4a3b86d21051e6be95923794e24" exitCode=0 Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.705857 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-499z7" event={"ID":"0d3c9160-dd6d-4591-9554-d3c74df3a64e","Type":"ContainerDied","Data":"38f62ebe43eed17090600fd985ab87c725adb4a3b86d21051e6be95923794e24"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.707493 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2zkj4" event={"ID":"05f03ea4-2462-4f2c-b9b8-395fc9802993","Type":"ContainerStarted","Data":"c34444b452577c8ce28445ed13810cd08c00c27fb0fdc17083d4a2fd0f78af8b"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.707981 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-2714-account-create-update-j5l9f" podStartSLOduration=3.707971603 podStartE2EDuration="3.707971603s" podCreationTimestamp="2026-02-18 11:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.700908834 +0000 UTC m=+1041.428612934" 
watchObservedRunningTime="2026-02-18 11:53:59.707971603 +0000 UTC m=+1041.435675673" Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.710339 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a3b1-account-create-update-5qfd8" event={"ID":"85eec6a5-292b-4061-bb90-18904535d9cc","Type":"ContainerStarted","Data":"cca6d9a35f7a3b90b18a7c03d2317544897c0eada4e7ac4555360fd3747c2a53"} Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.721336 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dd51-account-create-update-fr8ml" podStartSLOduration=7.721316602 podStartE2EDuration="7.721316602s" podCreationTimestamp="2026-02-18 11:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.712400986 +0000 UTC m=+1041.440105076" watchObservedRunningTime="2026-02-18 11:53:59.721316602 +0000 UTC m=+1041.449020692" Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.737775 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" podStartSLOduration=5.73775932 podStartE2EDuration="5.73775932s" podCreationTimestamp="2026-02-18 11:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.733301477 +0000 UTC m=+1041.461005567" watchObservedRunningTime="2026-02-18 11:53:59.73775932 +0000 UTC m=+1041.465463400" Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.773897 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-95b8-account-create-update-58r6z" podStartSLOduration=5.773878548 podStartE2EDuration="5.773878548s" podCreationTimestamp="2026-02-18 11:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-18 11:53:59.768387508 +0000 UTC m=+1041.496091588" watchObservedRunningTime="2026-02-18 11:53:59.773878548 +0000 UTC m=+1041.501582628"
Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.786493 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-2zkj4" podStartSLOduration=7.786475618 podStartE2EDuration="7.786475618s" podCreationTimestamp="2026-02-18 11:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.781280306 +0000 UTC m=+1041.508984386" watchObservedRunningTime="2026-02-18 11:53:59.786475618 +0000 UTC m=+1041.514179698"
Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.823315 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-mqx2n" podStartSLOduration=3.8232972629999997 podStartE2EDuration="3.823297263s" podCreationTimestamp="2026-02-18 11:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.814637543 +0000 UTC m=+1041.542341623" watchObservedRunningTime="2026-02-18 11:53:59.823297263 +0000 UTC m=+1041.551001343"
Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.837934 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hz64r"]
Feb 18 11:53:59 crc kubenswrapper[4922]: I0218 11:53:59.840627 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-a3b1-account-create-update-5qfd8" podStartSLOduration=6.840606543 podStartE2EDuration="6.840606543s" podCreationTimestamp="2026-02-18 11:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.827923571 +0000 UTC m=+1041.555627671" watchObservedRunningTime="2026-02-18 11:53:59.840606543 +0000 UTC m=+1041.568310623"
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.719798 4922 generic.go:334] "Generic (PLEG): container finished" podID="cc452273-8a5f-47d8-8aa5-1ddfe2240e28" containerID="a445d9cdaa7af5fce54eb1734c1edd288e3e96655dfca8e174ddcae4e353e89d" exitCode=0
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.719894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-95b8-account-create-update-58r6z" event={"ID":"cc452273-8a5f-47d8-8aa5-1ddfe2240e28","Type":"ContainerDied","Data":"a445d9cdaa7af5fce54eb1734c1edd288e3e96655dfca8e174ddcae4e353e89d"}
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.722213 4922 generic.go:334] "Generic (PLEG): container finished" podID="85eec6a5-292b-4061-bb90-18904535d9cc" containerID="cca6d9a35f7a3b90b18a7c03d2317544897c0eada4e7ac4555360fd3747c2a53" exitCode=0
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.722267 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a3b1-account-create-update-5qfd8" event={"ID":"85eec6a5-292b-4061-bb90-18904535d9cc","Type":"ContainerDied","Data":"cca6d9a35f7a3b90b18a7c03d2317544897c0eada4e7ac4555360fd3747c2a53"}
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.723829 4922 generic.go:334] "Generic (PLEG): container finished" podID="05f03ea4-2462-4f2c-b9b8-395fc9802993" containerID="c34444b452577c8ce28445ed13810cd08c00c27fb0fdc17083d4a2fd0f78af8b" exitCode=0
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.723900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2zkj4" event={"ID":"05f03ea4-2462-4f2c-b9b8-395fc9802993","Type":"ContainerDied","Data":"c34444b452577c8ce28445ed13810cd08c00c27fb0fdc17083d4a2fd0f78af8b"}
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.726389 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerStarted","Data":"3f427700514946e216712d6593eedc4f63a171beb78d1aacdd21101e9bbb6fd9"}
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.728083 4922 generic.go:334] "Generic (PLEG): container finished" podID="811ffd65-f5dc-44a3-a1cb-778937ca9771" containerID="e6310d509175d11956cf35f2881cac092138c8da93939b01d64f0006c350cdcc" exitCode=0
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.728141 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd51-account-create-update-fr8ml" event={"ID":"811ffd65-f5dc-44a3-a1cb-778937ca9771","Type":"ContainerDied","Data":"e6310d509175d11956cf35f2881cac092138c8da93939b01d64f0006c350cdcc"}
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.729671 4922 generic.go:334] "Generic (PLEG): container finished" podID="cac3a541-a2f7-4d95-97ff-1361fbd3e81e" containerID="e7a3b382cba61f7101e30bbe239f4c456f983c4335448f6fab20d5e184472620" exitCode=0
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.729721 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mqx2n" event={"ID":"cac3a541-a2f7-4d95-97ff-1361fbd3e81e","Type":"ContainerDied","Data":"e7a3b382cba61f7101e30bbe239f4c456f983c4335448f6fab20d5e184472620"}
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.736050 4922 generic.go:334] "Generic (PLEG): container finished" podID="3b15fbe3-8f30-41e8-8897-037694ccb56b" containerID="fedb74d1b6b4c2f0bd2aef2df515af61a8cc0caff9248cf99b996d4ac610fc62" exitCode=0
Feb 18 11:54:00 crc kubenswrapper[4922]: I0218 11:54:00.736103 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2714-account-create-update-j5l9f" event={"ID":"3b15fbe3-8f30-41e8-8897-037694ccb56b","Type":"ContainerDied","Data":"fedb74d1b6b4c2f0bd2aef2df515af61a8cc0caff9248cf99b996d4ac610fc62"}
Feb 18 11:54:02 crc kubenswrapper[4922]: W0218 11:54:02.396409 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82871f18_4432_42c4_bbfd_61ff507a1e95.slice/crio-3b32203e1127c8bf7749c5d652a55547d30ff0cbc6a4eb374405868fb79465cd WatchSource:0}: Error finding container 3b32203e1127c8bf7749c5d652a55547d30ff0cbc6a4eb374405868fb79465cd: Status 404 returned error can't find the container with id 3b32203e1127c8bf7749c5d652a55547d30ff0cbc6a4eb374405868fb79465cd
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.521464 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mqx2n"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.529942 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2zkj4"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.538078 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-95b8-account-create-update-58r6z"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.575002 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-xj7zt"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.585676 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a3b1-account-create-update-5qfd8"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.598510 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-499z7"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.607079 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jvv6\" (UniqueName: \"kubernetes.io/projected/05f03ea4-2462-4f2c-b9b8-395fc9802993-kube-api-access-6jvv6\") pod \"05f03ea4-2462-4f2c-b9b8-395fc9802993\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.607302 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-operator-scripts\") pod \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.607516 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-operator-scripts\") pod \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.608018 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2714-account-create-update-j5l9f"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.607964 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc452273-8a5f-47d8-8aa5-1ddfe2240e28" (UID: "cc452273-8a5f-47d8-8aa5-1ddfe2240e28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.608002 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cac3a541-a2f7-4d95-97ff-1361fbd3e81e" (UID: "cac3a541-a2f7-4d95-97ff-1361fbd3e81e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.608273 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44zdx\" (UniqueName: \"kubernetes.io/projected/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-kube-api-access-44zdx\") pod \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\" (UID: \"cac3a541-a2f7-4d95-97ff-1361fbd3e81e\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.608393 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05f03ea4-2462-4f2c-b9b8-395fc9802993-operator-scripts\") pod \"05f03ea4-2462-4f2c-b9b8-395fc9802993\" (UID: \"05f03ea4-2462-4f2c-b9b8-395fc9802993\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.608505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6t58\" (UniqueName: \"kubernetes.io/projected/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-kube-api-access-d6t58\") pod \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\" (UID: \"cc452273-8a5f-47d8-8aa5-1ddfe2240e28\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.608992 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05f03ea4-2462-4f2c-b9b8-395fc9802993-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05f03ea4-2462-4f2c-b9b8-395fc9802993" (UID: "05f03ea4-2462-4f2c-b9b8-395fc9802993"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.610875 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.611007 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.611070 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05f03ea4-2462-4f2c-b9b8-395fc9802993-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.616254 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-kube-api-access-44zdx" (OuterVolumeSpecName: "kube-api-access-44zdx") pod "cac3a541-a2f7-4d95-97ff-1361fbd3e81e" (UID: "cac3a541-a2f7-4d95-97ff-1361fbd3e81e"). InnerVolumeSpecName "kube-api-access-44zdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.618013 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f03ea4-2462-4f2c-b9b8-395fc9802993-kube-api-access-6jvv6" (OuterVolumeSpecName: "kube-api-access-6jvv6") pod "05f03ea4-2462-4f2c-b9b8-395fc9802993" (UID: "05f03ea4-2462-4f2c-b9b8-395fc9802993"). InnerVolumeSpecName "kube-api-access-6jvv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.620647 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd51-account-create-update-fr8ml"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.627924 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-kube-api-access-d6t58" (OuterVolumeSpecName: "kube-api-access-d6t58") pod "cc452273-8a5f-47d8-8aa5-1ddfe2240e28" (UID: "cc452273-8a5f-47d8-8aa5-1ddfe2240e28"). InnerVolumeSpecName "kube-api-access-d6t58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.712267 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqlnd\" (UniqueName: \"kubernetes.io/projected/85eec6a5-292b-4061-bb90-18904535d9cc-kube-api-access-mqlnd\") pod \"85eec6a5-292b-4061-bb90-18904535d9cc\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.712638 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8ps9\" (UniqueName: \"kubernetes.io/projected/0d3c9160-dd6d-4591-9554-d3c74df3a64e-kube-api-access-m8ps9\") pod \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.712735 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ffd65-f5dc-44a3-a1cb-778937ca9771-operator-scripts\") pod \"811ffd65-f5dc-44a3-a1cb-778937ca9771\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.712814 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmlzq\" (UniqueName: \"kubernetes.io/projected/811ffd65-f5dc-44a3-a1cb-778937ca9771-kube-api-access-vmlzq\") pod \"811ffd65-f5dc-44a3-a1cb-778937ca9771\" (UID: \"811ffd65-f5dc-44a3-a1cb-778937ca9771\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.712890 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl925\" (UniqueName: \"kubernetes.io/projected/3b15fbe3-8f30-41e8-8897-037694ccb56b-kube-api-access-vl925\") pod \"3b15fbe3-8f30-41e8-8897-037694ccb56b\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.712960 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e854dba-d50f-4228-9b7a-c8a0ae16347a-operator-scripts\") pod \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713043 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b15fbe3-8f30-41e8-8897-037694ccb56b-operator-scripts\") pod \"3b15fbe3-8f30-41e8-8897-037694ccb56b\" (UID: \"3b15fbe3-8f30-41e8-8897-037694ccb56b\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713136 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d3c9160-dd6d-4591-9554-d3c74df3a64e-operator-scripts\") pod \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\" (UID: \"0d3c9160-dd6d-4591-9554-d3c74df3a64e\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713244 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fft6\" (UniqueName: \"kubernetes.io/projected/3e854dba-d50f-4228-9b7a-c8a0ae16347a-kube-api-access-6fft6\") pod \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\" (UID: \"3e854dba-d50f-4228-9b7a-c8a0ae16347a\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713398 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85eec6a5-292b-4061-bb90-18904535d9cc-operator-scripts\") pod \"85eec6a5-292b-4061-bb90-18904535d9cc\" (UID: \"85eec6a5-292b-4061-bb90-18904535d9cc\") "
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713679 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/811ffd65-f5dc-44a3-a1cb-778937ca9771-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "811ffd65-f5dc-44a3-a1cb-778937ca9771" (UID: "811ffd65-f5dc-44a3-a1cb-778937ca9771"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713885 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6t58\" (UniqueName: \"kubernetes.io/projected/cc452273-8a5f-47d8-8aa5-1ddfe2240e28-kube-api-access-d6t58\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.713954 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jvv6\" (UniqueName: \"kubernetes.io/projected/05f03ea4-2462-4f2c-b9b8-395fc9802993-kube-api-access-6jvv6\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.714016 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ffd65-f5dc-44a3-a1cb-778937ca9771-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.714075 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44zdx\" (UniqueName: \"kubernetes.io/projected/cac3a541-a2f7-4d95-97ff-1361fbd3e81e-kube-api-access-44zdx\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.714091 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b15fbe3-8f30-41e8-8897-037694ccb56b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b15fbe3-8f30-41e8-8897-037694ccb56b" (UID: "3b15fbe3-8f30-41e8-8897-037694ccb56b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.714727 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e854dba-d50f-4228-9b7a-c8a0ae16347a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e854dba-d50f-4228-9b7a-c8a0ae16347a" (UID: "3e854dba-d50f-4228-9b7a-c8a0ae16347a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.715420 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85eec6a5-292b-4061-bb90-18904535d9cc-kube-api-access-mqlnd" (OuterVolumeSpecName: "kube-api-access-mqlnd") pod "85eec6a5-292b-4061-bb90-18904535d9cc" (UID: "85eec6a5-292b-4061-bb90-18904535d9cc"). InnerVolumeSpecName "kube-api-access-mqlnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.715822 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3c9160-dd6d-4591-9554-d3c74df3a64e-kube-api-access-m8ps9" (OuterVolumeSpecName: "kube-api-access-m8ps9") pod "0d3c9160-dd6d-4591-9554-d3c74df3a64e" (UID: "0d3c9160-dd6d-4591-9554-d3c74df3a64e"). InnerVolumeSpecName "kube-api-access-m8ps9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.715902 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3c9160-dd6d-4591-9554-d3c74df3a64e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d3c9160-dd6d-4591-9554-d3c74df3a64e" (UID: "0d3c9160-dd6d-4591-9554-d3c74df3a64e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.716160 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85eec6a5-292b-4061-bb90-18904535d9cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85eec6a5-292b-4061-bb90-18904535d9cc" (UID: "85eec6a5-292b-4061-bb90-18904535d9cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.716461 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b15fbe3-8f30-41e8-8897-037694ccb56b-kube-api-access-vl925" (OuterVolumeSpecName: "kube-api-access-vl925") pod "3b15fbe3-8f30-41e8-8897-037694ccb56b" (UID: "3b15fbe3-8f30-41e8-8897-037694ccb56b"). InnerVolumeSpecName "kube-api-access-vl925". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.718991 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e854dba-d50f-4228-9b7a-c8a0ae16347a-kube-api-access-6fft6" (OuterVolumeSpecName: "kube-api-access-6fft6") pod "3e854dba-d50f-4228-9b7a-c8a0ae16347a" (UID: "3e854dba-d50f-4228-9b7a-c8a0ae16347a"). InnerVolumeSpecName "kube-api-access-6fft6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.720814 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811ffd65-f5dc-44a3-a1cb-778937ca9771-kube-api-access-vmlzq" (OuterVolumeSpecName: "kube-api-access-vmlzq") pod "811ffd65-f5dc-44a3-a1cb-778937ca9771" (UID: "811ffd65-f5dc-44a3-a1cb-778937ca9771"). InnerVolumeSpecName "kube-api-access-vmlzq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.763565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-95b8-account-create-update-58r6z" event={"ID":"cc452273-8a5f-47d8-8aa5-1ddfe2240e28","Type":"ContainerDied","Data":"54e9125c24a959588989fe6a7b334775970a6c4b231353eee75179d5fb3c2947"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.763609 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54e9125c24a959588989fe6a7b334775970a6c4b231353eee75179d5fb3c2947"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.763702 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-95b8-account-create-update-58r6z"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.765106 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-xj7zt" event={"ID":"3e854dba-d50f-4228-9b7a-c8a0ae16347a","Type":"ContainerDied","Data":"c68fcfcd006d47497315707b33d56f34856ae845dcd2357d6a67318d6da6c7f6"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.765163 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c68fcfcd006d47497315707b33d56f34856ae845dcd2357d6a67318d6da6c7f6"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.765265 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-xj7zt"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.768247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hz64r" event={"ID":"82871f18-4432-42c4-bbfd-61ff507a1e95","Type":"ContainerStarted","Data":"3b32203e1127c8bf7749c5d652a55547d30ff0cbc6a4eb374405868fb79465cd"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.770155 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd51-account-create-update-fr8ml" event={"ID":"811ffd65-f5dc-44a3-a1cb-778937ca9771","Type":"ContainerDied","Data":"80ee7e9b0cec62a50bee70889b18acf5fa94b2334f321fee43b9ca55b8bd52cc"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.770214 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ee7e9b0cec62a50bee70889b18acf5fa94b2334f321fee43b9ca55b8bd52cc"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.770534 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd51-account-create-update-fr8ml"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.774639 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2714-account-create-update-j5l9f" event={"ID":"3b15fbe3-8f30-41e8-8897-037694ccb56b","Type":"ContainerDied","Data":"b02d51eb3381fc4834bb097a9321f3a4baf43667d320279cf5100f69846caf84"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.774677 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b02d51eb3381fc4834bb097a9321f3a4baf43667d320279cf5100f69846caf84"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.775013 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2714-account-create-update-j5l9f"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.780884 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a3b1-account-create-update-5qfd8"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.780900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a3b1-account-create-update-5qfd8" event={"ID":"85eec6a5-292b-4061-bb90-18904535d9cc","Type":"ContainerDied","Data":"099bd4d1700ca1fdea66438e1c36747fd163667b51fd5545201fba20a6a21ddf"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.781184 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099bd4d1700ca1fdea66438e1c36747fd163667b51fd5545201fba20a6a21ddf"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.782714 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-499z7"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.782754 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-499z7" event={"ID":"0d3c9160-dd6d-4591-9554-d3c74df3a64e","Type":"ContainerDied","Data":"30c5754ee779c0d183f858e5000b674deb35f4da6cc4bd94f8c25db1f475b7bc"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.782776 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c5754ee779c0d183f858e5000b674deb35f4da6cc4bd94f8c25db1f475b7bc"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.784906 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2zkj4" event={"ID":"05f03ea4-2462-4f2c-b9b8-395fc9802993","Type":"ContainerDied","Data":"2a299295394b49d8734504ca223eb46c67dbed1ff8a13151afb9f72e374ec15e"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.784931 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a299295394b49d8734504ca223eb46c67dbed1ff8a13151afb9f72e374ec15e"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.784977 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2zkj4"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.789586 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mqx2n" event={"ID":"cac3a541-a2f7-4d95-97ff-1361fbd3e81e","Type":"ContainerDied","Data":"35bfb15c61c9106211ef6e4dae348b65fb77a6409596090df5a648a4b7682f47"}
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.789632 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bfb15c61c9106211ef6e4dae348b65fb77a6409596090df5a648a4b7682f47"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.789700 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mqx2n"
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.816628 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8ps9\" (UniqueName: \"kubernetes.io/projected/0d3c9160-dd6d-4591-9554-d3c74df3a64e-kube-api-access-m8ps9\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817499 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmlzq\" (UniqueName: \"kubernetes.io/projected/811ffd65-f5dc-44a3-a1cb-778937ca9771-kube-api-access-vmlzq\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817523 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl925\" (UniqueName: \"kubernetes.io/projected/3b15fbe3-8f30-41e8-8897-037694ccb56b-kube-api-access-vl925\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817535 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e854dba-d50f-4228-9b7a-c8a0ae16347a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817545 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b15fbe3-8f30-41e8-8897-037694ccb56b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817555 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d3c9160-dd6d-4591-9554-d3c74df3a64e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817565 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fft6\" (UniqueName: \"kubernetes.io/projected/3e854dba-d50f-4228-9b7a-c8a0ae16347a-kube-api-access-6fft6\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817574 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85eec6a5-292b-4061-bb90-18904535d9cc-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:02 crc kubenswrapper[4922]: I0218 11:54:02.817583 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqlnd\" (UniqueName: \"kubernetes.io/projected/85eec6a5-292b-4061-bb90-18904535d9cc-kube-api-access-mqlnd\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:03 crc kubenswrapper[4922]: I0218 11:54:03.527388 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:54:03 crc kubenswrapper[4922]: E0218 11:54:03.527615 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 18 11:54:03 crc kubenswrapper[4922]: E0218 11:54:03.527653 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 18 11:54:03 crc kubenswrapper[4922]: E0218 11:54:03.527718 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift podName:0771bdc1-7622-4a65-aa82-3150630ce652 nodeName:}" failed. No retries permitted until 2026-02-18 11:54:11.52769493 +0000 UTC m=+1053.255399010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift") pod "swift-storage-0" (UID: "0771bdc1-7622-4a65-aa82-3150630ce652") : configmap "swift-ring-files" not found
Feb 18 11:54:04 crc kubenswrapper[4922]: I0218 11:54:04.263884 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 18 11:54:04 crc kubenswrapper[4922]: I0218 11:54:04.807548 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hz64r" event={"ID":"82871f18-4432-42c4-bbfd-61ff507a1e95","Type":"ContainerStarted","Data":"f9617d9b57da4e95bb7eb7f0412ba485b6082bcd962a46f87e7d295e47d23bfb"}
Feb 18 11:54:04 crc kubenswrapper[4922]: I0218 11:54:04.814533 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq"
Feb 18 11:54:04 crc kubenswrapper[4922]: I0218 11:54:04.829736 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-hz64r" podStartSLOduration=6.829713437 podStartE2EDuration="6.829713437s" podCreationTimestamp="2026-02-18 11:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:04.821040796 +0000 UTC m=+1046.548744876" watchObservedRunningTime="2026-02-18 11:54:04.829713437 +0000 UTC m=+1046.557417527"
Feb 18 11:54:04 crc kubenswrapper[4922]: I0218 11:54:04.912407 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mt2n6"]
Feb 18 11:54:04 crc kubenswrapper[4922]: I0218 11:54:04.912671 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-mt2n6" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" containerName="dnsmasq-dns" containerID="cri-o://4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94" gracePeriod=10
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.401232 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mt2n6"
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.461656 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-nb\") pod \"66ee71b5-db58-4478-94c7-0067be9c018e\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") "
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.461793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/66ee71b5-db58-4478-94c7-0067be9c018e-kube-api-access-xglxs\") pod \"66ee71b5-db58-4478-94c7-0067be9c018e\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") "
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.461882 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-config\") pod \"66ee71b5-db58-4478-94c7-0067be9c018e\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") "
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.461959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-sb\") pod \"66ee71b5-db58-4478-94c7-0067be9c018e\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") "
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.462038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-dns-svc\") pod \"66ee71b5-db58-4478-94c7-0067be9c018e\" (UID: \"66ee71b5-db58-4478-94c7-0067be9c018e\") "
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.477938 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ee71b5-db58-4478-94c7-0067be9c018e-kube-api-access-xglxs" (OuterVolumeSpecName: "kube-api-access-xglxs") pod "66ee71b5-db58-4478-94c7-0067be9c018e" (UID: "66ee71b5-db58-4478-94c7-0067be9c018e"). InnerVolumeSpecName "kube-api-access-xglxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.507022 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66ee71b5-db58-4478-94c7-0067be9c018e" (UID: "66ee71b5-db58-4478-94c7-0067be9c018e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.514894 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-config" (OuterVolumeSpecName: "config") pod "66ee71b5-db58-4478-94c7-0067be9c018e" (UID: "66ee71b5-db58-4478-94c7-0067be9c018e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.517135 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66ee71b5-db58-4478-94c7-0067be9c018e" (UID: "66ee71b5-db58-4478-94c7-0067be9c018e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.532169 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66ee71b5-db58-4478-94c7-0067be9c018e" (UID: "66ee71b5-db58-4478-94c7-0067be9c018e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.565750 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.565790 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xglxs\" (UniqueName: \"kubernetes.io/projected/66ee71b5-db58-4478-94c7-0067be9c018e-kube-api-access-xglxs\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.565804 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.565815 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18
11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.565824 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ee71b5-db58-4478-94c7-0067be9c018e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.820631 4922 generic.go:334] "Generic (PLEG): container finished" podID="66ee71b5-db58-4478-94c7-0067be9c018e" containerID="4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94" exitCode=0 Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.820710 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mt2n6" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.820729 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mt2n6" event={"ID":"66ee71b5-db58-4478-94c7-0067be9c018e","Type":"ContainerDied","Data":"4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94"} Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.820778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mt2n6" event={"ID":"66ee71b5-db58-4478-94c7-0067be9c018e","Type":"ContainerDied","Data":"ff48b67de3bebc231cad1e6022f943d477fb1723890c418485b7af3b8fa2de11"} Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.820799 4922 scope.go:117] "RemoveContainer" containerID="4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.824782 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6gzbs" event={"ID":"83fbf909-70fe-4d3c-9b45-3f5a6733779c","Type":"ContainerStarted","Data":"b2a0d9fa0c0ee6a87b933b519a3cc219be9eb78825f3cbe1dcc1c3100e2a1d94"} Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.827324 4922 generic.go:334] "Generic (PLEG): container finished" podID="82871f18-4432-42c4-bbfd-61ff507a1e95" 
containerID="f9617d9b57da4e95bb7eb7f0412ba485b6082bcd962a46f87e7d295e47d23bfb" exitCode=0 Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.827374 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hz64r" event={"ID":"82871f18-4432-42c4-bbfd-61ff507a1e95","Type":"ContainerDied","Data":"f9617d9b57da4e95bb7eb7f0412ba485b6082bcd962a46f87e7d295e47d23bfb"} Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.860223 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6gzbs" podStartSLOduration=3.733553199 podStartE2EDuration="10.860203156s" podCreationTimestamp="2026-02-18 11:53:55 +0000 UTC" firstStartedPulling="2026-02-18 11:53:57.692815249 +0000 UTC m=+1039.420519329" lastFinishedPulling="2026-02-18 11:54:04.819465206 +0000 UTC m=+1046.547169286" observedRunningTime="2026-02-18 11:54:05.857897207 +0000 UTC m=+1047.585601287" watchObservedRunningTime="2026-02-18 11:54:05.860203156 +0000 UTC m=+1047.587907236" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.915720 4922 scope.go:117] "RemoveContainer" containerID="4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.927260 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mt2n6"] Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.933808 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mt2n6"] Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.953390 4922 scope.go:117] "RemoveContainer" containerID="4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94" Feb 18 11:54:05 crc kubenswrapper[4922]: E0218 11:54:05.954008 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94\": container with ID 
starting with 4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94 not found: ID does not exist" containerID="4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.954058 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94"} err="failed to get container status \"4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94\": rpc error: code = NotFound desc = could not find container \"4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94\": container with ID starting with 4e3aef5e68dbf272d244ded6025cd71542de6a272f367b19ff7f37bb7a72fc94 not found: ID does not exist" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.954098 4922 scope.go:117] "RemoveContainer" containerID="4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969" Feb 18 11:54:05 crc kubenswrapper[4922]: E0218 11:54:05.954531 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969\": container with ID starting with 4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969 not found: ID does not exist" containerID="4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969" Feb 18 11:54:05 crc kubenswrapper[4922]: I0218 11:54:05.954561 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969"} err="failed to get container status \"4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969\": rpc error: code = NotFound desc = could not find container \"4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969\": container with ID starting with 4802ba16c84d861fcc08bada9e91f7825e817d37cd0acc6e85ba3301fe424969 not found: 
ID does not exist" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:06.993893 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" path="/var/lib/kubelet/pods/66ee71b5-db58-4478-94c7-0067be9c018e/volumes" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.066744 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-st9pz"] Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067376 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3c9160-dd6d-4591-9554-d3c74df3a64e" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067395 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c9160-dd6d-4591-9554-d3c74df3a64e" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067407 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e854dba-d50f-4228-9b7a-c8a0ae16347a" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067414 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e854dba-d50f-4228-9b7a-c8a0ae16347a" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067433 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85eec6a5-292b-4061-bb90-18904535d9cc" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067439 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="85eec6a5-292b-4061-bb90-18904535d9cc" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067448 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc452273-8a5f-47d8-8aa5-1ddfe2240e28" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067454 4922 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cc452273-8a5f-47d8-8aa5-1ddfe2240e28" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067462 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac3a541-a2f7-4d95-97ff-1361fbd3e81e" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067468 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac3a541-a2f7-4d95-97ff-1361fbd3e81e" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067479 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f03ea4-2462-4f2c-b9b8-395fc9802993" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067484 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f03ea4-2462-4f2c-b9b8-395fc9802993" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067492 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b15fbe3-8f30-41e8-8897-037694ccb56b" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067499 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b15fbe3-8f30-41e8-8897-037694ccb56b" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067509 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811ffd65-f5dc-44a3-a1cb-778937ca9771" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067515 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="811ffd65-f5dc-44a3-a1cb-778937ca9771" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067527 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" containerName="dnsmasq-dns" Feb 18 11:54:07 
crc kubenswrapper[4922]: I0218 11:54:07.067533 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" containerName="dnsmasq-dns" Feb 18 11:54:07 crc kubenswrapper[4922]: E0218 11:54:07.067542 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" containerName="init" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067548 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" containerName="init" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067704 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="811ffd65-f5dc-44a3-a1cb-778937ca9771" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067715 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ee71b5-db58-4478-94c7-0067be9c018e" containerName="dnsmasq-dns" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067722 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f03ea4-2462-4f2c-b9b8-395fc9802993" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067731 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac3a541-a2f7-4d95-97ff-1361fbd3e81e" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067742 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc452273-8a5f-47d8-8aa5-1ddfe2240e28" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067752 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3c9160-dd6d-4591-9554-d3c74df3a64e" containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067759 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e854dba-d50f-4228-9b7a-c8a0ae16347a" 
containerName="mariadb-database-create" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067769 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="85eec6a5-292b-4061-bb90-18904535d9cc" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.067779 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b15fbe3-8f30-41e8-8897-037694ccb56b" containerName="mariadb-account-create-update" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.068352 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.071226 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jr8f4" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.072798 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.083508 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-st9pz"] Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.204900 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmckq\" (UniqueName: \"kubernetes.io/projected/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-kube-api-access-zmckq\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.204952 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-combined-ca-bundle\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 
11:54:07.205030 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-config-data\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.205057 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-db-sync-config-data\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.306356 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmckq\" (UniqueName: \"kubernetes.io/projected/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-kube-api-access-zmckq\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.306412 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-combined-ca-bundle\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.306482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-config-data\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.306512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-db-sync-config-data\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.314997 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-combined-ca-bundle\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.316047 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-db-sync-config-data\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.324661 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-config-data\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.325843 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmckq\" (UniqueName: \"kubernetes.io/projected/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-kube-api-access-zmckq\") pod \"glance-db-sync-st9pz\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.406625 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-st9pz" Feb 18 11:54:07 crc kubenswrapper[4922]: I0218 11:54:07.516889 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 18 11:54:08 crc kubenswrapper[4922]: I0218 11:54:08.872020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hz64r" event={"ID":"82871f18-4432-42c4-bbfd-61ff507a1e95","Type":"ContainerDied","Data":"3b32203e1127c8bf7749c5d652a55547d30ff0cbc6a4eb374405868fb79465cd"} Feb 18 11:54:08 crc kubenswrapper[4922]: I0218 11:54:08.872270 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b32203e1127c8bf7749c5d652a55547d30ff0cbc6a4eb374405868fb79465cd" Feb 18 11:54:08 crc kubenswrapper[4922]: I0218 11:54:08.931969 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hz64r" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.034478 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82871f18-4432-42c4-bbfd-61ff507a1e95-operator-scripts\") pod \"82871f18-4432-42c4-bbfd-61ff507a1e95\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.034562 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpvzn\" (UniqueName: \"kubernetes.io/projected/82871f18-4432-42c4-bbfd-61ff507a1e95-kube-api-access-bpvzn\") pod \"82871f18-4432-42c4-bbfd-61ff507a1e95\" (UID: \"82871f18-4432-42c4-bbfd-61ff507a1e95\") " Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.035442 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82871f18-4432-42c4-bbfd-61ff507a1e95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82871f18-4432-42c4-bbfd-61ff507a1e95" (UID: 
"82871f18-4432-42c4-bbfd-61ff507a1e95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.049376 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82871f18-4432-42c4-bbfd-61ff507a1e95-kube-api-access-bpvzn" (OuterVolumeSpecName: "kube-api-access-bpvzn") pod "82871f18-4432-42c4-bbfd-61ff507a1e95" (UID: "82871f18-4432-42c4-bbfd-61ff507a1e95"). InnerVolumeSpecName "kube-api-access-bpvzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.137663 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82871f18-4432-42c4-bbfd-61ff507a1e95-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.137711 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpvzn\" (UniqueName: \"kubernetes.io/projected/82871f18-4432-42c4-bbfd-61ff507a1e95-kube-api-access-bpvzn\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.402695 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-st9pz"] Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.807799 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.808111 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.808158 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.808845 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ecb5c1316b12312a3aa69dd6c259c33fc3f21c1c782e1e74d55d4b725fb05a8"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.808914 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://3ecb5c1316b12312a3aa69dd6c259c33fc3f21c1c782e1e74d55d4b725fb05a8" gracePeriod=600 Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.880570 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-st9pz" event={"ID":"855fb3ec-e473-4a99-a94f-cc96dda6d9c4","Type":"ContainerStarted","Data":"bb03e387f02f6078ba9ca11f5028b069ffe62c115543a3d26dcd8e4428a02edd"} Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.883239 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hz64r" Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.883274 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerStarted","Data":"184f9306d2796d4d920eb675a2e3eb752c81b217570c13dde972d69b9b2436a0"} Feb 18 11:54:09 crc kubenswrapper[4922]: I0218 11:54:09.917857 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=9.950005457 podStartE2EDuration="1m5.917836597s" podCreationTimestamp="2026-02-18 11:53:04 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.280385255 +0000 UTC m=+995.008089335" lastFinishedPulling="2026-02-18 11:54:09.248216395 +0000 UTC m=+1050.975920475" observedRunningTime="2026-02-18 11:54:09.911168007 +0000 UTC m=+1051.638872087" watchObservedRunningTime="2026-02-18 11:54:09.917836597 +0000 UTC m=+1051.645540677" Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.023796 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hz64r"] Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.031133 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hz64r"] Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.529097 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.904012 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="3ecb5c1316b12312a3aa69dd6c259c33fc3f21c1c782e1e74d55d4b725fb05a8" exitCode=0 Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.904068 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" 
event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"3ecb5c1316b12312a3aa69dd6c259c33fc3f21c1c782e1e74d55d4b725fb05a8"} Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.904118 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"6ab0358a6b4b84604aa8265da97113127295fc06806ed22c39a69885110c93fc"} Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.904138 4922 scope.go:117] "RemoveContainer" containerID="0ea4c69ba94e3a69c2a9d6932ead886d1aa8f5af4ec72d79e294ae3c3d8f54dd" Feb 18 11:54:10 crc kubenswrapper[4922]: I0218 11:54:10.983758 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82871f18-4432-42c4-bbfd-61ff507a1e95" path="/var/lib/kubelet/pods/82871f18-4432-42c4-bbfd-61ff507a1e95/volumes" Feb 18 11:54:11 crc kubenswrapper[4922]: I0218 11:54:11.611237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0" Feb 18 11:54:11 crc kubenswrapper[4922]: E0218 11:54:11.611458 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 11:54:11 crc kubenswrapper[4922]: E0218 11:54:11.611555 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 11:54:11 crc kubenswrapper[4922]: E0218 11:54:11.611631 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift podName:0771bdc1-7622-4a65-aa82-3150630ce652 nodeName:}" failed. 
No retries permitted until 2026-02-18 11:54:27.611614506 +0000 UTC m=+1069.339318586 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift") pod "swift-storage-0" (UID: "0771bdc1-7622-4a65-aa82-3150630ce652") : configmap "swift-ring-files" not found Feb 18 11:54:11 crc kubenswrapper[4922]: I0218 11:54:11.987317 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-996pg" podUID="a2d0a226-07e2-402d-a868-2f8374670dac" containerName="ovn-controller" probeResult="failure" output=< Feb 18 11:54:11 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 18 11:54:11 crc kubenswrapper[4922]: > Feb 18 11:54:12 crc kubenswrapper[4922]: I0218 11:54:12.930715 4922 generic.go:334] "Generic (PLEG): container finished" podID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerID="fcebb6698c6e4874e1a9ec8fbf1e0b1f0b32b5ba69a663e4d66216e0e480bd70" exitCode=0 Feb 18 11:54:12 crc kubenswrapper[4922]: I0218 11:54:12.930822 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cef557d2-b935-4cf6-98f1-d3c2251c0e38","Type":"ContainerDied","Data":"fcebb6698c6e4874e1a9ec8fbf1e0b1f0b32b5ba69a663e4d66216e0e480bd70"} Feb 18 11:54:12 crc kubenswrapper[4922]: I0218 11:54:12.932647 4922 generic.go:334] "Generic (PLEG): container finished" podID="83fbf909-70fe-4d3c-9b45-3f5a6733779c" containerID="b2a0d9fa0c0ee6a87b933b519a3cc219be9eb78825f3cbe1dcc1c3100e2a1d94" exitCode=0 Feb 18 11:54:12 crc kubenswrapper[4922]: I0218 11:54:12.932704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6gzbs" event={"ID":"83fbf909-70fe-4d3c-9b45-3f5a6733779c","Type":"ContainerDied","Data":"b2a0d9fa0c0ee6a87b933b519a3cc219be9eb78825f3cbe1dcc1c3100e2a1d94"} Feb 18 11:54:12 crc kubenswrapper[4922]: I0218 11:54:12.942490 4922 
generic.go:334] "Generic (PLEG): container finished" podID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerID="e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a" exitCode=0 Feb 18 11:54:12 crc kubenswrapper[4922]: I0218 11:54:12.942545 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12b84523-522e-4e8c-b78e-0094262fb1f8","Type":"ContainerDied","Data":"e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a"} Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.738034 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qvjkl"] Feb 18 11:54:13 crc kubenswrapper[4922]: E0218 11:54:13.738897 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82871f18-4432-42c4-bbfd-61ff507a1e95" containerName="mariadb-account-create-update" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.738917 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="82871f18-4432-42c4-bbfd-61ff507a1e95" containerName="mariadb-account-create-update" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.739113 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="82871f18-4432-42c4-bbfd-61ff507a1e95" containerName="mariadb-account-create-update" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.739751 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.742274 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.749267 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qvjkl"] Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.853481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83953c4e-9a54-453a-880f-d5e4c01608f9-operator-scripts\") pod \"root-account-create-update-qvjkl\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.853674 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdxh\" (UniqueName: \"kubernetes.io/projected/83953c4e-9a54-453a-880f-d5e4c01608f9-kube-api-access-dxdxh\") pod \"root-account-create-update-qvjkl\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.953899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12b84523-522e-4e8c-b78e-0094262fb1f8","Type":"ContainerStarted","Data":"ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3"} Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.954181 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.955376 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdxh\" (UniqueName: 
\"kubernetes.io/projected/83953c4e-9a54-453a-880f-d5e4c01608f9-kube-api-access-dxdxh\") pod \"root-account-create-update-qvjkl\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.955478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83953c4e-9a54-453a-880f-d5e4c01608f9-operator-scripts\") pod \"root-account-create-update-qvjkl\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.956433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83953c4e-9a54-453a-880f-d5e4c01608f9-operator-scripts\") pod \"root-account-create-update-qvjkl\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.959716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cef557d2-b935-4cf6-98f1-d3c2251c0e38","Type":"ContainerStarted","Data":"745e13628b88d2d6249c23c49d872e4f0c6a229af5e85aec709bffe789a60191"} Feb 18 11:54:13 crc kubenswrapper[4922]: I0218 11:54:13.977938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdxh\" (UniqueName: \"kubernetes.io/projected/83953c4e-9a54-453a-880f-d5e4c01608f9-kube-api-access-dxdxh\") pod \"root-account-create-update-qvjkl\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.009155 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=65.366206733 podStartE2EDuration="1m17.009128003s" 
podCreationTimestamp="2026-02-18 11:52:57 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.085861674 +0000 UTC m=+994.813565754" lastFinishedPulling="2026-02-18 11:53:24.728782904 +0000 UTC m=+1006.456487024" observedRunningTime="2026-02-18 11:54:13.998212616 +0000 UTC m=+1055.725916696" watchObservedRunningTime="2026-02-18 11:54:14.009128003 +0000 UTC m=+1055.736832083" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.067153 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.469287 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.502484 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=66.824249365 podStartE2EDuration="1m17.502449646s" podCreationTimestamp="2026-02-18 11:52:57 +0000 UTC" firstStartedPulling="2026-02-18 11:53:13.427054482 +0000 UTC m=+995.154758562" lastFinishedPulling="2026-02-18 11:53:24.105254763 +0000 UTC m=+1005.832958843" observedRunningTime="2026-02-18 11:54:14.047524599 +0000 UTC m=+1055.775228779" watchObservedRunningTime="2026-02-18 11:54:14.502449646 +0000 UTC m=+1056.230153726" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.570458 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83fbf909-70fe-4d3c-9b45-3f5a6733779c-etc-swift\") pod \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.570611 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s79m\" (UniqueName: \"kubernetes.io/projected/83fbf909-70fe-4d3c-9b45-3f5a6733779c-kube-api-access-6s79m\") pod 
\"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.570670 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-ring-data-devices\") pod \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.570843 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-swiftconf\") pod \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.570963 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-scripts\") pod \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.570989 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-combined-ca-bundle\") pod \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.571066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-dispersionconf\") pod \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\" (UID: \"83fbf909-70fe-4d3c-9b45-3f5a6733779c\") " Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.572093 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.572452 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83fbf909-70fe-4d3c-9b45-3f5a6733779c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.577760 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83fbf909-70fe-4d3c-9b45-3f5a6733779c-kube-api-access-6s79m" (OuterVolumeSpecName: "kube-api-access-6s79m") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). InnerVolumeSpecName "kube-api-access-6s79m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.581307 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.597557 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.601481 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-scripts" (OuterVolumeSpecName: "scripts") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.606811 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83fbf909-70fe-4d3c-9b45-3f5a6733779c" (UID: "83fbf909-70fe-4d3c-9b45-3f5a6733779c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673788 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s79m\" (UniqueName: \"kubernetes.io/projected/83fbf909-70fe-4d3c-9b45-3f5a6733779c-kube-api-access-6s79m\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673860 4922 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673872 4922 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673882 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83fbf909-70fe-4d3c-9b45-3f5a6733779c-scripts\") on node 
\"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673890 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673898 4922 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83fbf909-70fe-4d3c-9b45-3f5a6733779c-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.673906 4922 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83fbf909-70fe-4d3c-9b45-3f5a6733779c-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.718914 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qvjkl"] Feb 18 11:54:14 crc kubenswrapper[4922]: W0218 11:54:14.738283 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83953c4e_9a54_453a_880f_d5e4c01608f9.slice/crio-9b3edbf7a895e8645eb5ae0cc01ca375f19a1fd80e1f6971b3313bb081ae8ee8 WatchSource:0}: Error finding container 9b3edbf7a895e8645eb5ae0cc01ca375f19a1fd80e1f6971b3313bb081ae8ee8: Status 404 returned error can't find the container with id 9b3edbf7a895e8645eb5ae0cc01ca375f19a1fd80e1f6971b3313bb081ae8ee8 Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.981890 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6gzbs" Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.998320 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qvjkl" event={"ID":"83953c4e-9a54-453a-880f-d5e4c01608f9","Type":"ContainerStarted","Data":"50e02399793b7c21fb8b885dfb76c0b8a822098669151d568f198c715c6c35d1"} Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.998388 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qvjkl" event={"ID":"83953c4e-9a54-453a-880f-d5e4c01608f9","Type":"ContainerStarted","Data":"9b3edbf7a895e8645eb5ae0cc01ca375f19a1fd80e1f6971b3313bb081ae8ee8"} Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.998404 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6gzbs" event={"ID":"83fbf909-70fe-4d3c-9b45-3f5a6733779c","Type":"ContainerDied","Data":"7cb68b07b7628d48bbfa07f469ab2d1994a576e88de08a8b50b0bca8c09412b9"} Feb 18 11:54:14 crc kubenswrapper[4922]: I0218 11:54:14.998420 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb68b07b7628d48bbfa07f469ab2d1994a576e88de08a8b50b0bca8c09412b9" Feb 18 11:54:15 crc kubenswrapper[4922]: I0218 11:54:15.017051 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-qvjkl" podStartSLOduration=2.017031438 podStartE2EDuration="2.017031438s" podCreationTimestamp="2026-02-18 11:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:15.003446413 +0000 UTC m=+1056.731150503" watchObservedRunningTime="2026-02-18 11:54:15.017031438 +0000 UTC m=+1056.744735518" Feb 18 11:54:15 crc kubenswrapper[4922]: I0218 11:54:15.992424 4922 generic.go:334] "Generic (PLEG): container finished" podID="83953c4e-9a54-453a-880f-d5e4c01608f9" 
containerID="50e02399793b7c21fb8b885dfb76c0b8a822098669151d568f198c715c6c35d1" exitCode=0 Feb 18 11:54:15 crc kubenswrapper[4922]: I0218 11:54:15.992575 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qvjkl" event={"ID":"83953c4e-9a54-453a-880f-d5e4c01608f9","Type":"ContainerDied","Data":"50e02399793b7c21fb8b885dfb76c0b8a822098669151d568f198c715c6c35d1"} Feb 18 11:54:16 crc kubenswrapper[4922]: I0218 11:54:16.983483 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-stvc7" Feb 18 11:54:16 crc kubenswrapper[4922]: I0218 11:54:16.992253 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-stvc7" Feb 18 11:54:16 crc kubenswrapper[4922]: I0218 11:54:16.994990 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-996pg" podUID="a2d0a226-07e2-402d-a868-2f8374670dac" containerName="ovn-controller" probeResult="failure" output=< Feb 18 11:54:16 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 18 11:54:16 crc kubenswrapper[4922]: > Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.252083 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-996pg-config-mkd9k"] Feb 18 11:54:17 crc kubenswrapper[4922]: E0218 11:54:17.252810 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fbf909-70fe-4d3c-9b45-3f5a6733779c" containerName="swift-ring-rebalance" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.252836 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fbf909-70fe-4d3c-9b45-3f5a6733779c" containerName="swift-ring-rebalance" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.253085 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fbf909-70fe-4d3c-9b45-3f5a6733779c" containerName="swift-ring-rebalance" Feb 18 11:54:17 
crc kubenswrapper[4922]: I0218 11:54:17.253794 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.256068 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.260649 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-996pg-config-mkd9k"] Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.325382 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc75n\" (UniqueName: \"kubernetes.io/projected/82ba5186-f0aa-4d19-a516-254374dba75f-kube-api-access-pc75n\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.325564 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-log-ovn\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.325596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.325772 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run-ovn\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.325824 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-scripts\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.325853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-additional-scripts\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427413 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run-ovn\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-scripts\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427510 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-additional-scripts\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427546 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc75n\" (UniqueName: \"kubernetes.io/projected/82ba5186-f0aa-4d19-a516-254374dba75f-kube-api-access-pc75n\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-log-ovn\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427764 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.427824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-log-ovn\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.428029 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run-ovn\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.428592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-additional-scripts\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.429738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-scripts\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.458992 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc75n\" (UniqueName: \"kubernetes.io/projected/82ba5186-f0aa-4d19-a516-254374dba75f-kube-api-access-pc75n\") pod \"ovn-controller-996pg-config-mkd9k\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") " pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:17 crc kubenswrapper[4922]: I0218 11:54:17.569908 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-996pg-config-mkd9k" Feb 18 11:54:18 crc kubenswrapper[4922]: I0218 11:54:18.525601 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 18 11:54:20 crc kubenswrapper[4922]: I0218 11:54:20.528976 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:20 crc kubenswrapper[4922]: I0218 11:54:20.533035 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:21 crc kubenswrapper[4922]: I0218 11:54:21.058061 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:21 crc kubenswrapper[4922]: I0218 11:54:21.993441 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-996pg" podUID="a2d0a226-07e2-402d-a868-2f8374670dac" containerName="ovn-controller" probeResult="failure" output=< Feb 18 11:54:21 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 18 11:54:21 crc kubenswrapper[4922]: > Feb 18 11:54:23 crc kubenswrapper[4922]: I0218 11:54:23.347752 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:54:23 crc kubenswrapper[4922]: I0218 11:54:23.348063 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="prometheus" containerID="cri-o://e96b484d9840abfb2c7757906d574c4d66e64e51bbf1dbb2212180e14faff247" gracePeriod=600 Feb 18 11:54:23 crc kubenswrapper[4922]: I0218 11:54:23.348582 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" 
containerName="thanos-sidecar" containerID="cri-o://184f9306d2796d4d920eb675a2e3eb752c81b217570c13dde972d69b9b2436a0" gracePeriod=600 Feb 18 11:54:23 crc kubenswrapper[4922]: I0218 11:54:23.348643 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="config-reloader" containerID="cri-o://3f427700514946e216712d6593eedc4f63a171beb78d1aacdd21101e9bbb6fd9" gracePeriod=600 Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.091768 4922 generic.go:334] "Generic (PLEG): container finished" podID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerID="184f9306d2796d4d920eb675a2e3eb752c81b217570c13dde972d69b9b2436a0" exitCode=0 Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.091806 4922 generic.go:334] "Generic (PLEG): container finished" podID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerID="3f427700514946e216712d6593eedc4f63a171beb78d1aacdd21101e9bbb6fd9" exitCode=0 Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.091814 4922 generic.go:334] "Generic (PLEG): container finished" podID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerID="e96b484d9840abfb2c7757906d574c4d66e64e51bbf1dbb2212180e14faff247" exitCode=0 Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.091834 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerDied","Data":"184f9306d2796d4d920eb675a2e3eb752c81b217570c13dde972d69b9b2436a0"} Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.091862 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerDied","Data":"3f427700514946e216712d6593eedc4f63a171beb78d1aacdd21101e9bbb6fd9"} Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.091874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerDied","Data":"e96b484d9840abfb2c7757906d574c4d66e64e51bbf1dbb2212180e14faff247"} Feb 18 11:54:24 crc kubenswrapper[4922]: E0218 11:54:24.705204 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 18 11:54:24 crc kubenswrapper[4922]: E0218 11:54:24.705449 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmckq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/terminati
on-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-st9pz_openstack(855fb3ec-e473-4a99-a94f-cc96dda6d9c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:54:24 crc kubenswrapper[4922]: E0218 11:54:24.707245 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-st9pz" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.737905 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.785577 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdxh\" (UniqueName: \"kubernetes.io/projected/83953c4e-9a54-453a-880f-d5e4c01608f9-kube-api-access-dxdxh\") pod \"83953c4e-9a54-453a-880f-d5e4c01608f9\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.785716 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83953c4e-9a54-453a-880f-d5e4c01608f9-operator-scripts\") pod \"83953c4e-9a54-453a-880f-d5e4c01608f9\" (UID: \"83953c4e-9a54-453a-880f-d5e4c01608f9\") " Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.787644 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83953c4e-9a54-453a-880f-d5e4c01608f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83953c4e-9a54-453a-880f-d5e4c01608f9" (UID: "83953c4e-9a54-453a-880f-d5e4c01608f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.795868 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83953c4e-9a54-453a-880f-d5e4c01608f9-kube-api-access-dxdxh" (OuterVolumeSpecName: "kube-api-access-dxdxh") pod "83953c4e-9a54-453a-880f-d5e4c01608f9" (UID: "83953c4e-9a54-453a-880f-d5e4c01608f9"). InnerVolumeSpecName "kube-api-access-dxdxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.888875 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdxh\" (UniqueName: \"kubernetes.io/projected/83953c4e-9a54-453a-880f-d5e4c01608f9-kube-api-access-dxdxh\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:24 crc kubenswrapper[4922]: I0218 11:54:24.888920 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83953c4e-9a54-453a-880f-d5e4c01608f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.043946 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.091567 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.091871 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config-out\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092033 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-thanos-prometheus-http-client-file\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092147 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d6c8\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-kube-api-access-2d6c8\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092226 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-0\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092414 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092739 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-2\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092862 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-1\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.092946 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-tls-assets\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.093031 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-web-config\") pod \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\" (UID: \"e35b9ac7-2e11-4096-a77a-4be1a41d737f\") " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.098913 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.108531 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.108563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.109628 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.113724 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-kube-api-access-2d6c8" (OuterVolumeSpecName: "kube-api-access-2d6c8") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "kube-api-access-2d6c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.119579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config-out" (OuterVolumeSpecName: "config-out") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.120040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qvjkl" event={"ID":"83953c4e-9a54-453a-880f-d5e4c01608f9","Type":"ContainerDied","Data":"9b3edbf7a895e8645eb5ae0cc01ca375f19a1fd80e1f6971b3313bb081ae8ee8"} Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.120086 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3edbf7a895e8645eb5ae0cc01ca375f19a1fd80e1f6971b3313bb081ae8ee8" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.120177 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qvjkl" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.126344 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e35b9ac7-2e11-4096-a77a-4be1a41d737f","Type":"ContainerDied","Data":"bf8e622508a488fa6a5aab10a0db437c746f98562fadc68cb32fcd3c28724d08"} Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.126421 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.126429 4922 scope.go:117] "RemoveContainer" containerID="184f9306d2796d4d920eb675a2e3eb752c81b217570c13dde972d69b9b2436a0" Feb 18 11:54:25 crc kubenswrapper[4922]: E0218 11:54:25.128948 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-st9pz" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.130942 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.133130 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config" (OuterVolumeSpecName: "config") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.147885 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "pvc-15ee050c-37dd-43df-8cbe-1200f09a5545". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.175876 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-web-config" (OuterVolumeSpecName: "web-config") pod "e35b9ac7-2e11-4096-a77a-4be1a41d737f" (UID: "e35b9ac7-2e11-4096-a77a-4be1a41d737f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.176018 4922 scope.go:117] "RemoveContainer" containerID="3f427700514946e216712d6593eedc4f63a171beb78d1aacdd21101e9bbb6fd9" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196496 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") on node \"crc\" " Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196526 4922 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config-out\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196543 4922 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196554 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d6c8\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-kube-api-access-2d6c8\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196566 4922 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196576 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196586 4922 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196595 4922 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e35b9ac7-2e11-4096-a77a-4be1a41d737f-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196604 4922 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e35b9ac7-2e11-4096-a77a-4be1a41d737f-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.196613 4922 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e35b9ac7-2e11-4096-a77a-4be1a41d737f-web-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.197663 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-996pg-config-mkd9k"] Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.208680 4922 scope.go:117] "RemoveContainer" containerID="e96b484d9840abfb2c7757906d574c4d66e64e51bbf1dbb2212180e14faff247" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.214721 4922 csi_attacher.go:630] kubernetes.io/csi: 
attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.214898 4922 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-15ee050c-37dd-43df-8cbe-1200f09a5545" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545") on node "crc" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.239402 4922 scope.go:117] "RemoveContainer" containerID="8a6f423971fc2df44ffe536b7fffc2792fbaae14dda7a484a39a6fbec81c8d3e" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.298077 4922 reconciler_common.go:293] "Volume detached for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.464324 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.472160 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508176 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:54:25 crc kubenswrapper[4922]: E0218 11:54:25.508617 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="init-config-reloader" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508642 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="init-config-reloader" Feb 18 11:54:25 crc kubenswrapper[4922]: E0218 11:54:25.508656 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="thanos-sidecar" Feb 18 11:54:25 crc 
kubenswrapper[4922]: I0218 11:54:25.508664 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="thanos-sidecar" Feb 18 11:54:25 crc kubenswrapper[4922]: E0218 11:54:25.508681 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83953c4e-9a54-453a-880f-d5e4c01608f9" containerName="mariadb-account-create-update" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508692 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="83953c4e-9a54-453a-880f-d5e4c01608f9" containerName="mariadb-account-create-update" Feb 18 11:54:25 crc kubenswrapper[4922]: E0218 11:54:25.508713 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="prometheus" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508721 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="prometheus" Feb 18 11:54:25 crc kubenswrapper[4922]: E0218 11:54:25.508731 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="config-reloader" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508740 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="config-reloader" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508941 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="83953c4e-9a54-453a-880f-d5e4c01608f9" containerName="mariadb-account-create-update" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508960 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="thanos-sidecar" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508973 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" 
containerName="config-reloader" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.508985 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" containerName="prometheus" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.510878 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.522432 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.522582 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.522649 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.522918 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.523277 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xmthr" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.523466 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.523580 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.523653 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.534588 4922 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.539750 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.602835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.602875 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.602978 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603069 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603117 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603152 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603226 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603376 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603497 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603561 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp5t7\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-kube-api-access-vp5t7\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.603809 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0" 
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705547 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705639 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705688 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705715 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp5t7\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-kube-api-access-vp5t7\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705750 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705834 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705858 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705890 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705920 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705947 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.705973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.706012 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.707788 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.707989 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.708044 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.710776 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.711191 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.711464 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.716809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.717342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.718105 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.718172 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.718532 4922 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.718557 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d461f0c4a551673a0d7d7003637451f1312f1b9722a2159a051859daee296e97/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.721866 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.729852 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp5t7\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-kube-api-access-vp5t7\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.754166 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:25 crc kubenswrapper[4922]: I0218 11:54:25.834999 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 18 11:54:26 crc kubenswrapper[4922]: I0218 11:54:26.145317 4922 generic.go:334] "Generic (PLEG): container finished" podID="82ba5186-f0aa-4d19-a516-254374dba75f" containerID="5176cb9980de6bcd0a67b80f4ff01a72286ab10295b4c3d177fe01a39914f0b0" exitCode=0
Feb 18 11:54:26 crc kubenswrapper[4922]: I0218 11:54:26.145617 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-996pg-config-mkd9k" event={"ID":"82ba5186-f0aa-4d19-a516-254374dba75f","Type":"ContainerDied","Data":"5176cb9980de6bcd0a67b80f4ff01a72286ab10295b4c3d177fe01a39914f0b0"}
Feb 18 11:54:26 crc kubenswrapper[4922]: I0218 11:54:26.145643 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-996pg-config-mkd9k" event={"ID":"82ba5186-f0aa-4d19-a516-254374dba75f","Type":"ContainerStarted","Data":"b06f706f916de7eacc8759839f38b70d5279b78a62e299714b26a81bb5f0da22"}
Feb 18 11:54:26 crc kubenswrapper[4922]: W0218 11:54:26.205783 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791eb2d2_e8b0_401d_8b8b_b79d675c1ca4.slice/crio-eb650b51fbe53331680115b5b916f11fa8f568a0682dba6d2801da306b88e53d WatchSource:0}: Error finding container eb650b51fbe53331680115b5b916f11fa8f568a0682dba6d2801da306b88e53d: Status 404 returned error can't find the container with id eb650b51fbe53331680115b5b916f11fa8f568a0682dba6d2801da306b88e53d
Feb 18 11:54:26 crc kubenswrapper[4922]: I0218 11:54:26.211015 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 11:54:26 crc kubenswrapper[4922]: I0218 11:54:26.983964 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35b9ac7-2e11-4096-a77a-4be1a41d737f" path="/var/lib/kubelet/pods/e35b9ac7-2e11-4096-a77a-4be1a41d737f/volumes"
Feb 18 11:54:26 crc kubenswrapper[4922]: I0218 11:54:26.985011 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-996pg"
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.154223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerStarted","Data":"eb650b51fbe53331680115b5b916f11fa8f568a0682dba6d2801da306b88e53d"}
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.507817 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-996pg-config-mkd9k"
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.538817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-additional-scripts\") pod \"82ba5186-f0aa-4d19-a516-254374dba75f\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") "
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.538867 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run\") pod \"82ba5186-f0aa-4d19-a516-254374dba75f\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") "
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.538945 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-scripts\") pod \"82ba5186-f0aa-4d19-a516-254374dba75f\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") "
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.538978 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run-ovn\") pod \"82ba5186-f0aa-4d19-a516-254374dba75f\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") "
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539008 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc75n\" (UniqueName: \"kubernetes.io/projected/82ba5186-f0aa-4d19-a516-254374dba75f-kube-api-access-pc75n\") pod \"82ba5186-f0aa-4d19-a516-254374dba75f\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") "
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539039 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-log-ovn\") pod \"82ba5186-f0aa-4d19-a516-254374dba75f\" (UID: \"82ba5186-f0aa-4d19-a516-254374dba75f\") "
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539033 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run" (OuterVolumeSpecName: "var-run") pod "82ba5186-f0aa-4d19-a516-254374dba75f" (UID: "82ba5186-f0aa-4d19-a516-254374dba75f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539094 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "82ba5186-f0aa-4d19-a516-254374dba75f" (UID: "82ba5186-f0aa-4d19-a516-254374dba75f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539552 4922 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539572 4922 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539614 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "82ba5186-f0aa-4d19-a516-254374dba75f" (UID: "82ba5186-f0aa-4d19-a516-254374dba75f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.539834 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "82ba5186-f0aa-4d19-a516-254374dba75f" (UID: "82ba5186-f0aa-4d19-a516-254374dba75f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.540304 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-scripts" (OuterVolumeSpecName: "scripts") pod "82ba5186-f0aa-4d19-a516-254374dba75f" (UID: "82ba5186-f0aa-4d19-a516-254374dba75f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.545291 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ba5186-f0aa-4d19-a516-254374dba75f-kube-api-access-pc75n" (OuterVolumeSpecName: "kube-api-access-pc75n") pod "82ba5186-f0aa-4d19-a516-254374dba75f" (UID: "82ba5186-f0aa-4d19-a516-254374dba75f"). InnerVolumeSpecName "kube-api-access-pc75n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.641231 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.641340 4922 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82ba5186-f0aa-4d19-a516-254374dba75f-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.641352 4922 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.641373 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82ba5186-f0aa-4d19-a516-254374dba75f-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.641382 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc75n\" (UniqueName: \"kubernetes.io/projected/82ba5186-f0aa-4d19-a516-254374dba75f-kube-api-access-pc75n\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.645898 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0771bdc1-7622-4a65-aa82-3150630ce652-etc-swift\") pod \"swift-storage-0\" (UID: \"0771bdc1-7622-4a65-aa82-3150630ce652\") " pod="openstack/swift-storage-0"
Feb 18 11:54:27 crc kubenswrapper[4922]: I0218 11:54:27.743389 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.164589 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-996pg-config-mkd9k" event={"ID":"82ba5186-f0aa-4d19-a516-254374dba75f","Type":"ContainerDied","Data":"b06f706f916de7eacc8759839f38b70d5279b78a62e299714b26a81bb5f0da22"}
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.164883 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06f706f916de7eacc8759839f38b70d5279b78a62e299714b26a81bb5f0da22"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.164725 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-996pg-config-mkd9k"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.257273 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 18 11:54:28 crc kubenswrapper[4922]: W0218 11:54:28.365562 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0771bdc1_7622_4a65_aa82_3150630ce652.slice/crio-38b08c7945f70d47345b2139c350579b55cad071b52f78b49b6dc84b2b6dc24f WatchSource:0}: Error finding container 38b08c7945f70d47345b2139c350579b55cad071b52f78b49b6dc84b2b6dc24f: Status 404 returned error can't find the container with id 38b08c7945f70d47345b2139c350579b55cad071b52f78b49b6dc84b2b6dc24f
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.528521 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.643687 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-996pg-config-mkd9k"]
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.653625 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-996pg-config-mkd9k"]
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.873808 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.994575 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ba5186-f0aa-4d19-a516-254374dba75f" path="/var/lib/kubelet/pods/82ba5186-f0aa-4d19-a516-254374dba75f/volumes"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.995450 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-g95qz"]
Feb 18 11:54:28 crc kubenswrapper[4922]: E0218 11:54:28.995920 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ba5186-f0aa-4d19-a516-254374dba75f" containerName="ovn-config"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.996015 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ba5186-f0aa-4d19-a516-254374dba75f" containerName="ovn-config"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.996399 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ba5186-f0aa-4d19-a516-254374dba75f" containerName="ovn-config"
Feb 18 11:54:28 crc kubenswrapper[4922]: I0218 11:54:28.997307 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.054969 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-g95qz"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.139329 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-sznrv"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.143696 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.147069 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-sznrv"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.152773 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-82jsz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.153015 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.170531 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbqt\" (UniqueName: \"kubernetes.io/projected/b521417c-1968-49ee-8435-9e44af7e8a52-kube-api-access-wkbqt\") pod \"cinder-db-create-g95qz\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") " pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.171383 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b521417c-1968-49ee-8435-9e44af7e8a52-operator-scripts\") pod \"cinder-db-create-g95qz\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") " pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.180123 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"38b08c7945f70d47345b2139c350579b55cad071b52f78b49b6dc84b2b6dc24f"}
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.185658 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f0db-account-create-update-vwv99"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.186848 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.200635 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.251352 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f0db-account-create-update-vwv99"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273353 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-combined-ca-bundle\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273454 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmr9\" (UniqueName: \"kubernetes.io/projected/87604619-ec13-480d-9456-c5062685287d-kube-api-access-2cmr9\") pod \"cinder-f0db-account-create-update-vwv99\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") " pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273528 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbqt\" (UniqueName: \"kubernetes.io/projected/b521417c-1968-49ee-8435-9e44af7e8a52-kube-api-access-wkbqt\") pod \"cinder-db-create-g95qz\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") " pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87604619-ec13-480d-9456-c5062685287d-operator-scripts\") pod \"cinder-f0db-account-create-update-vwv99\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") " pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273614 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-config-data\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-db-sync-config-data\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273693 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b521417c-1968-49ee-8435-9e44af7e8a52-operator-scripts\") pod \"cinder-db-create-g95qz\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") " pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.273797 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwsh\" (UniqueName: \"kubernetes.io/projected/5f835c05-4bbb-4678-9410-8523cf308f05-kube-api-access-cxwsh\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.274858 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b521417c-1968-49ee-8435-9e44af7e8a52-operator-scripts\") pod \"cinder-db-create-g95qz\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") " pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.338695 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbqt\" (UniqueName: \"kubernetes.io/projected/b521417c-1968-49ee-8435-9e44af7e8a52-kube-api-access-wkbqt\") pod \"cinder-db-create-g95qz\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") " pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.339129 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.375173 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87604619-ec13-480d-9456-c5062685287d-operator-scripts\") pod \"cinder-f0db-account-create-update-vwv99\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") " pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.375486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-config-data\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.375575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-db-sync-config-data\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.375702 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwsh\" (UniqueName: \"kubernetes.io/projected/5f835c05-4bbb-4678-9410-8523cf308f05-kube-api-access-cxwsh\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.375786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-combined-ca-bundle\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.375864 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmr9\" (UniqueName: \"kubernetes.io/projected/87604619-ec13-480d-9456-c5062685287d-kube-api-access-2cmr9\") pod \"cinder-f0db-account-create-update-vwv99\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") " pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.376117 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87604619-ec13-480d-9456-c5062685287d-operator-scripts\") pod \"cinder-f0db-account-create-update-vwv99\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") " pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.381800 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-config-data\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.397932 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-db-sync-config-data\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.399516 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-combined-ca-bundle\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.437187 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmr9\" (UniqueName: \"kubernetes.io/projected/87604619-ec13-480d-9456-c5062685287d-kube-api-access-2cmr9\") pod \"cinder-f0db-account-create-update-vwv99\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") " pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.437288 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxwsh\" (UniqueName: \"kubernetes.io/projected/5f835c05-4bbb-4678-9410-8523cf308f05-kube-api-access-cxwsh\") pod \"watcher-db-sync-sznrv\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.449479 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-59fc-account-create-update-29wvh"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.450842 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-59fc-account-create-update-29wvh"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.455521 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.474420 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.478273 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-czsfv"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.479711 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czsfv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.497873 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-59fc-account-create-update-29wvh"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.521426 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-czsfv"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.526581 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.586841 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-fnkcj"]
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.592240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkd4\" (UniqueName: \"kubernetes.io/projected/3f413eca-d25a-4b47-82f6-e25088b65f2d-kube-api-access-qgkd4\") pod \"barbican-db-create-czsfv\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") " pod="openstack/barbican-db-create-czsfv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.592286 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f413eca-d25a-4b47-82f6-e25088b65f2d-operator-scripts\") pod \"barbican-db-create-czsfv\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") " pod="openstack/barbican-db-create-czsfv"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.592348 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mhx\" (UniqueName: \"kubernetes.io/projected/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-kube-api-access-65mhx\") pod \"barbican-59fc-account-create-update-29wvh\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") " pod="openstack/barbican-59fc-account-create-update-29wvh"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.592399 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-operator-scripts\") pod \"barbican-59fc-account-create-update-29wvh\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") " pod="openstack/barbican-59fc-account-create-update-29wvh"
Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 
11:54:29.593223 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.598254 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-fnkcj"] Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.628303 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8rn4" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.628968 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.629296 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.629414 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.640023 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b89-account-create-update-x8q45"] Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.645301 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.661723 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696497 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-combined-ca-bundle\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkd4\" (UniqueName: \"kubernetes.io/projected/3f413eca-d25a-4b47-82f6-e25088b65f2d-kube-api-access-qgkd4\") pod \"barbican-db-create-czsfv\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") " pod="openstack/barbican-db-create-czsfv" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696669 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsps6\" (UniqueName: \"kubernetes.io/projected/2102ef9b-8151-4edf-8b43-7c4486203911-kube-api-access-bsps6\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f413eca-d25a-4b47-82f6-e25088b65f2d-operator-scripts\") pod \"barbican-db-create-czsfv\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") " pod="openstack/barbican-db-create-czsfv" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696724 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-config-data\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696932 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65mhx\" (UniqueName: \"kubernetes.io/projected/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-kube-api-access-65mhx\") pod \"barbican-59fc-account-create-update-29wvh\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") " pod="openstack/barbican-59fc-account-create-update-29wvh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.696978 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-operator-scripts\") pod \"barbican-59fc-account-create-update-29wvh\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") " pod="openstack/barbican-59fc-account-create-update-29wvh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.697729 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f413eca-d25a-4b47-82f6-e25088b65f2d-operator-scripts\") pod \"barbican-db-create-czsfv\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") " pod="openstack/barbican-db-create-czsfv" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.697899 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-operator-scripts\") pod \"barbican-59fc-account-create-update-29wvh\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") " pod="openstack/barbican-59fc-account-create-update-29wvh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.712505 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-5b89-account-create-update-x8q45"] Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.741504 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mvrlh"] Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.743009 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.753571 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkd4\" (UniqueName: \"kubernetes.io/projected/3f413eca-d25a-4b47-82f6-e25088b65f2d-kube-api-access-qgkd4\") pod \"barbican-db-create-czsfv\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") " pod="openstack/barbican-db-create-czsfv" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.754489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mhx\" (UniqueName: \"kubernetes.io/projected/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-kube-api-access-65mhx\") pod \"barbican-59fc-account-create-update-29wvh\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") " pod="openstack/barbican-59fc-account-create-update-29wvh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.765255 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mvrlh"] Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.798226 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-combined-ca-bundle\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.798318 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsps6\" (UniqueName: 
\"kubernetes.io/projected/2102ef9b-8151-4edf-8b43-7c4486203911-kube-api-access-bsps6\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.798344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e55f2d-153c-47a0-95c4-84f8795ca57e-operator-scripts\") pod \"neutron-5b89-account-create-update-x8q45\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") " pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.798404 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-config-data\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.798474 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djps\" (UniqueName: \"kubernetes.io/projected/b9e55f2d-153c-47a0-95c4-84f8795ca57e-kube-api-access-9djps\") pod \"neutron-5b89-account-create-update-x8q45\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") " pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.812111 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-combined-ca-bundle\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.817707 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-config-data\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.818052 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsps6\" (UniqueName: \"kubernetes.io/projected/2102ef9b-8151-4edf-8b43-7c4486203911-kube-api-access-bsps6\") pod \"keystone-db-sync-fnkcj\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.846747 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-59fc-account-create-update-29wvh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.876304 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czsfv" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.902179 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-operator-scripts\") pod \"neutron-db-create-mvrlh\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") " pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.902242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e55f2d-153c-47a0-95c4-84f8795ca57e-operator-scripts\") pod \"neutron-5b89-account-create-update-x8q45\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") " pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.902323 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9djps\" (UniqueName: 
\"kubernetes.io/projected/b9e55f2d-153c-47a0-95c4-84f8795ca57e-kube-api-access-9djps\") pod \"neutron-5b89-account-create-update-x8q45\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") " pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.902355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhfx\" (UniqueName: \"kubernetes.io/projected/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-kube-api-access-qhhfx\") pod \"neutron-db-create-mvrlh\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") " pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.907585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e55f2d-153c-47a0-95c4-84f8795ca57e-operator-scripts\") pod \"neutron-5b89-account-create-update-x8q45\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") " pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.959088 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djps\" (UniqueName: \"kubernetes.io/projected/b9e55f2d-153c-47a0-95c4-84f8795ca57e-kube-api-access-9djps\") pod \"neutron-5b89-account-create-update-x8q45\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") " pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:29 crc kubenswrapper[4922]: I0218 11:54:29.972905 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.005883 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhfx\" (UniqueName: \"kubernetes.io/projected/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-kube-api-access-qhhfx\") pod \"neutron-db-create-mvrlh\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") " pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.006018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-operator-scripts\") pod \"neutron-db-create-mvrlh\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") " pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.006703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-operator-scripts\") pod \"neutron-db-create-mvrlh\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") " pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.056493 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhfx\" (UniqueName: \"kubernetes.io/projected/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-kube-api-access-qhhfx\") pod \"neutron-db-create-mvrlh\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") " pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.073667 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b89-account-create-update-x8q45" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.112539 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mvrlh" Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.164182 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qvjkl"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.183242 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qvjkl"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.201584 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerStarted","Data":"66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8"} Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.252395 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-g95qz"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.415945 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-sznrv"] Feb 18 11:54:30 crc kubenswrapper[4922]: W0218 11:54:30.458927 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f835c05_4bbb_4678_9410_8523cf308f05.slice/crio-64fe53faccd6664eee091113f52c4e7d0ab540979e620f3862763ef2ec700f6d WatchSource:0}: Error finding container 64fe53faccd6664eee091113f52c4e7d0ab540979e620f3862763ef2ec700f6d: Status 404 returned error can't find the container with id 64fe53faccd6664eee091113f52c4e7d0ab540979e620f3862763ef2ec700f6d Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.476256 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-czsfv"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.539107 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f0db-account-create-update-vwv99"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.651106 4922 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-59fc-account-create-update-29wvh"] Feb 18 11:54:30 crc kubenswrapper[4922]: W0218 11:54:30.692929 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76ba1bc1_3352_42a4_a80e_2fc2ac0e66eb.slice/crio-1d81707c5bd2ed4c306d7d2993e386735f9bc3f51595e0bf23abec0c6fba5f1a WatchSource:0}: Error finding container 1d81707c5bd2ed4c306d7d2993e386735f9bc3f51595e0bf23abec0c6fba5f1a: Status 404 returned error can't find the container with id 1d81707c5bd2ed4c306d7d2993e386735f9bc3f51595e0bf23abec0c6fba5f1a Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.701426 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b89-account-create-update-x8q45"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.870860 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-fnkcj"] Feb 18 11:54:30 crc kubenswrapper[4922]: I0218 11:54:30.908499 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mvrlh"] Feb 18 11:54:30 crc kubenswrapper[4922]: W0218 11:54:30.913711 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a88eb1_e58a_437e_b1eb_2dcb7e80b37f.slice/crio-c03df322018b9138faee6254589c424bb6e088b4296997dd45244c942e4db7ab WatchSource:0}: Error finding container c03df322018b9138faee6254589c424bb6e088b4296997dd45244c942e4db7ab: Status 404 returned error can't find the container with id c03df322018b9138faee6254589c424bb6e088b4296997dd45244c942e4db7ab Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.015774 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83953c4e-9a54-453a-880f-d5e4c01608f9" path="/var/lib/kubelet/pods/83953c4e-9a54-453a-880f-d5e4c01608f9/volumes" Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.211591 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-g95qz" event={"ID":"b521417c-1968-49ee-8435-9e44af7e8a52","Type":"ContainerStarted","Data":"9b8999d1e83a0573e94d5cd921d708e3242c66fa9f1b39ddd8ee350a09e0dbf3"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.213448 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0db-account-create-update-vwv99" event={"ID":"87604619-ec13-480d-9456-c5062685287d","Type":"ContainerStarted","Data":"2bdf174ced138c4cd3a885471b1652700fd63cb3ebd9aa23d03d3a168ff65eb0"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.215276 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sznrv" event={"ID":"5f835c05-4bbb-4678-9410-8523cf308f05","Type":"ContainerStarted","Data":"64fe53faccd6664eee091113f52c4e7d0ab540979e620f3862763ef2ec700f6d"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.217947 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-59fc-account-create-update-29wvh" event={"ID":"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb","Type":"ContainerStarted","Data":"1d81707c5bd2ed4c306d7d2993e386735f9bc3f51595e0bf23abec0c6fba5f1a"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.219555 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvrlh" event={"ID":"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f","Type":"ContainerStarted","Data":"c03df322018b9138faee6254589c424bb6e088b4296997dd45244c942e4db7ab"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.222056 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fnkcj" event={"ID":"2102ef9b-8151-4edf-8b43-7c4486203911","Type":"ContainerStarted","Data":"e7cf1c0a3ad45fa91a71f51e0bf0901154e6b84ba139717db0e095e711718562"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.223816 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b89-account-create-update-x8q45" 
event={"ID":"b9e55f2d-153c-47a0-95c4-84f8795ca57e","Type":"ContainerStarted","Data":"4d88076fc7a40e7ade149549d2f2ef5ee9743a2e42595b2835c326127fe730f0"} Feb 18 11:54:31 crc kubenswrapper[4922]: I0218 11:54:31.226652 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czsfv" event={"ID":"3f413eca-d25a-4b47-82f6-e25088b65f2d","Type":"ContainerStarted","Data":"284db3161b4e11bde939d2e242b4a6f4a60d3fc946f9e2e18054dbeb8b0bcecd"} Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.233899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0db-account-create-update-vwv99" event={"ID":"87604619-ec13-480d-9456-c5062685287d","Type":"ContainerStarted","Data":"459c3dccc59cc9a7a25c5999d57a9a40f0164adf154409ca83fde7883515bc8e"} Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.235539 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-59fc-account-create-update-29wvh" event={"ID":"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb","Type":"ContainerStarted","Data":"2773d6c651dc9198f969541910251805f8f5c022eea560495fc1c18971e69a25"} Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.237062 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvrlh" event={"ID":"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f","Type":"ContainerStarted","Data":"c059c2685304c1bbe7cb0f5bf34e25f7ad09441877372a8258fba3155ab8954a"} Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.239012 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b89-account-create-update-x8q45" event={"ID":"b9e55f2d-153c-47a0-95c4-84f8795ca57e","Type":"ContainerStarted","Data":"f872729a376e1215c5740d39fd4f7a50b0af6397c21fc8ffd7ccc6655e4b95bf"} Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.241286 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czsfv" 
event={"ID":"3f413eca-d25a-4b47-82f6-e25088b65f2d","Type":"ContainerStarted","Data":"0f464cbb217061ebc522def4ee4f957ba0ab7d4cce7acd97b0cad24b01e4e4cc"} Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.266119 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f0db-account-create-update-vwv99" podStartSLOduration=3.266099768 podStartE2EDuration="3.266099768s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:32.260082435 +0000 UTC m=+1073.987786515" watchObservedRunningTime="2026-02-18 11:54:32.266099768 +0000 UTC m=+1073.993803848" Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.280810 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b89-account-create-update-x8q45" podStartSLOduration=3.280793141 podStartE2EDuration="3.280793141s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:32.278159324 +0000 UTC m=+1074.005863404" watchObservedRunningTime="2026-02-18 11:54:32.280793141 +0000 UTC m=+1074.008497221" Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.308457 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-mvrlh" podStartSLOduration=3.308440003 podStartE2EDuration="3.308440003s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:32.304876393 +0000 UTC m=+1074.032580483" watchObservedRunningTime="2026-02-18 11:54:32.308440003 +0000 UTC m=+1074.036144083" Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.341010 4922 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/barbican-db-create-czsfv" podStartSLOduration=3.34098785 podStartE2EDuration="3.34098785s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:32.327741574 +0000 UTC m=+1074.055445654" watchObservedRunningTime="2026-02-18 11:54:32.34098785 +0000 UTC m=+1074.068691930" Feb 18 11:54:32 crc kubenswrapper[4922]: I0218 11:54:32.353005 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-59fc-account-create-update-29wvh" podStartSLOduration=3.352983595 podStartE2EDuration="3.352983595s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:32.346852819 +0000 UTC m=+1074.074556909" watchObservedRunningTime="2026-02-18 11:54:32.352983595 +0000 UTC m=+1074.080687675" Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.250834 4922 generic.go:334] "Generic (PLEG): container finished" podID="87604619-ec13-480d-9456-c5062685287d" containerID="459c3dccc59cc9a7a25c5999d57a9a40f0164adf154409ca83fde7883515bc8e" exitCode=0 Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.251152 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0db-account-create-update-vwv99" event={"ID":"87604619-ec13-480d-9456-c5062685287d","Type":"ContainerDied","Data":"459c3dccc59cc9a7a25c5999d57a9a40f0164adf154409ca83fde7883515bc8e"} Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.253652 4922 generic.go:334] "Generic (PLEG): container finished" podID="76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" containerID="2773d6c651dc9198f969541910251805f8f5c022eea560495fc1c18971e69a25" exitCode=0 Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.253698 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-59fc-account-create-update-29wvh" event={"ID":"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb","Type":"ContainerDied","Data":"2773d6c651dc9198f969541910251805f8f5c022eea560495fc1c18971e69a25"}
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.255456 4922 generic.go:334] "Generic (PLEG): container finished" podID="46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" containerID="c059c2685304c1bbe7cb0f5bf34e25f7ad09441877372a8258fba3155ab8954a" exitCode=0
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.255495 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvrlh" event={"ID":"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f","Type":"ContainerDied","Data":"c059c2685304c1bbe7cb0f5bf34e25f7ad09441877372a8258fba3155ab8954a"}
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.257283 4922 generic.go:334] "Generic (PLEG): container finished" podID="b9e55f2d-153c-47a0-95c4-84f8795ca57e" containerID="f872729a376e1215c5740d39fd4f7a50b0af6397c21fc8ffd7ccc6655e4b95bf" exitCode=0
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.257324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b89-account-create-update-x8q45" event={"ID":"b9e55f2d-153c-47a0-95c4-84f8795ca57e","Type":"ContainerDied","Data":"f872729a376e1215c5740d39fd4f7a50b0af6397c21fc8ffd7ccc6655e4b95bf"}
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.258890 4922 generic.go:334] "Generic (PLEG): container finished" podID="3f413eca-d25a-4b47-82f6-e25088b65f2d" containerID="0f464cbb217061ebc522def4ee4f957ba0ab7d4cce7acd97b0cad24b01e4e4cc" exitCode=0
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.258939 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czsfv" event={"ID":"3f413eca-d25a-4b47-82f6-e25088b65f2d","Type":"ContainerDied","Data":"0f464cbb217061ebc522def4ee4f957ba0ab7d4cce7acd97b0cad24b01e4e4cc"}
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.270094 4922 generic.go:334] "Generic (PLEG): container finished" podID="b521417c-1968-49ee-8435-9e44af7e8a52" containerID="cb46a05482d6c2364b368bb1c2e067b6de93db6a4072db86c206647939a79206" exitCode=0
Feb 18 11:54:33 crc kubenswrapper[4922]: I0218 11:54:33.270145 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-g95qz" event={"ID":"b521417c-1968-49ee-8435-9e44af7e8a52","Type":"ContainerDied","Data":"cb46a05482d6c2364b368bb1c2e067b6de93db6a4072db86c206647939a79206"}
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.291110 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"cb3763398a9b0ccacee702791c679ac99f40356ddb1a484f571a730c4fc2b052"}
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.291156 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"22c5e506a27e803c8fdafc19727c85eb82a49af3fcf31981e10ac5d6c6be1f28"}
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.819565 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.874558 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czsfv"
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.937283 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87604619-ec13-480d-9456-c5062685287d-operator-scripts\") pod \"87604619-ec13-480d-9456-c5062685287d\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") "
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.937332 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cmr9\" (UniqueName: \"kubernetes.io/projected/87604619-ec13-480d-9456-c5062685287d-kube-api-access-2cmr9\") pod \"87604619-ec13-480d-9456-c5062685287d\" (UID: \"87604619-ec13-480d-9456-c5062685287d\") "
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.937377 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkd4\" (UniqueName: \"kubernetes.io/projected/3f413eca-d25a-4b47-82f6-e25088b65f2d-kube-api-access-qgkd4\") pod \"3f413eca-d25a-4b47-82f6-e25088b65f2d\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") "
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.937532 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f413eca-d25a-4b47-82f6-e25088b65f2d-operator-scripts\") pod \"3f413eca-d25a-4b47-82f6-e25088b65f2d\" (UID: \"3f413eca-d25a-4b47-82f6-e25088b65f2d\") "
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.937965 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87604619-ec13-480d-9456-c5062685287d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87604619-ec13-480d-9456-c5062685287d" (UID: "87604619-ec13-480d-9456-c5062685287d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.940076 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f413eca-d25a-4b47-82f6-e25088b65f2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f413eca-d25a-4b47-82f6-e25088b65f2d" (UID: "3f413eca-d25a-4b47-82f6-e25088b65f2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.944658 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f413eca-d25a-4b47-82f6-e25088b65f2d-kube-api-access-qgkd4" (OuterVolumeSpecName: "kube-api-access-qgkd4") pod "3f413eca-d25a-4b47-82f6-e25088b65f2d" (UID: "3f413eca-d25a-4b47-82f6-e25088b65f2d"). InnerVolumeSpecName "kube-api-access-qgkd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.946244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87604619-ec13-480d-9456-c5062685287d-kube-api-access-2cmr9" (OuterVolumeSpecName: "kube-api-access-2cmr9") pod "87604619-ec13-480d-9456-c5062685287d" (UID: "87604619-ec13-480d-9456-c5062685287d"). InnerVolumeSpecName "kube-api-access-2cmr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.947615 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-59fc-account-create-update-29wvh"
Feb 18 11:54:34 crc kubenswrapper[4922]: I0218 11:54:34.948084 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.001785 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b89-account-create-update-x8q45"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.038349 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mvrlh"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.038519 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-operator-scripts\") pod \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.038625 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkbqt\" (UniqueName: \"kubernetes.io/projected/b521417c-1968-49ee-8435-9e44af7e8a52-kube-api-access-wkbqt\") pod \"b521417c-1968-49ee-8435-9e44af7e8a52\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.038787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b521417c-1968-49ee-8435-9e44af7e8a52-operator-scripts\") pod \"b521417c-1968-49ee-8435-9e44af7e8a52\" (UID: \"b521417c-1968-49ee-8435-9e44af7e8a52\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.038809 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65mhx\" (UniqueName: \"kubernetes.io/projected/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-kube-api-access-65mhx\") pod \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\" (UID: \"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.039279 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f413eca-d25a-4b47-82f6-e25088b65f2d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.039300 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87604619-ec13-480d-9456-c5062685287d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.039312 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cmr9\" (UniqueName: \"kubernetes.io/projected/87604619-ec13-480d-9456-c5062685287d-kube-api-access-2cmr9\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.039325 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkd4\" (UniqueName: \"kubernetes.io/projected/3f413eca-d25a-4b47-82f6-e25088b65f2d-kube-api-access-qgkd4\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.039856 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" (UID: "76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.041055 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b521417c-1968-49ee-8435-9e44af7e8a52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b521417c-1968-49ee-8435-9e44af7e8a52" (UID: "b521417c-1968-49ee-8435-9e44af7e8a52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.067512 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b521417c-1968-49ee-8435-9e44af7e8a52-kube-api-access-wkbqt" (OuterVolumeSpecName: "kube-api-access-wkbqt") pod "b521417c-1968-49ee-8435-9e44af7e8a52" (UID: "b521417c-1968-49ee-8435-9e44af7e8a52"). InnerVolumeSpecName "kube-api-access-wkbqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.073973 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-kube-api-access-65mhx" (OuterVolumeSpecName: "kube-api-access-65mhx") pod "76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" (UID: "76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb"). InnerVolumeSpecName "kube-api-access-65mhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.140851 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhhfx\" (UniqueName: \"kubernetes.io/projected/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-kube-api-access-qhhfx\") pod \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.141021 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e55f2d-153c-47a0-95c4-84f8795ca57e-operator-scripts\") pod \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.141080 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9djps\" (UniqueName: \"kubernetes.io/projected/b9e55f2d-153c-47a0-95c4-84f8795ca57e-kube-api-access-9djps\") pod \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\" (UID: \"b9e55f2d-153c-47a0-95c4-84f8795ca57e\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.141583 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-operator-scripts\") pod \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\" (UID: \"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f\") "
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.142889 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkbqt\" (UniqueName: \"kubernetes.io/projected/b521417c-1968-49ee-8435-9e44af7e8a52-kube-api-access-wkbqt\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.142913 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b521417c-1968-49ee-8435-9e44af7e8a52-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.142927 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65mhx\" (UniqueName: \"kubernetes.io/projected/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-kube-api-access-65mhx\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.142940 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.143756 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" (UID: "46a88eb1-e58a-437e-b1eb-2dcb7e80b37f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.147885 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-kube-api-access-qhhfx" (OuterVolumeSpecName: "kube-api-access-qhhfx") pod "46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" (UID: "46a88eb1-e58a-437e-b1eb-2dcb7e80b37f"). InnerVolumeSpecName "kube-api-access-qhhfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.148391 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e55f2d-153c-47a0-95c4-84f8795ca57e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9e55f2d-153c-47a0-95c4-84f8795ca57e" (UID: "b9e55f2d-153c-47a0-95c4-84f8795ca57e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.150940 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e55f2d-153c-47a0-95c4-84f8795ca57e-kube-api-access-9djps" (OuterVolumeSpecName: "kube-api-access-9djps") pod "b9e55f2d-153c-47a0-95c4-84f8795ca57e" (UID: "b9e55f2d-153c-47a0-95c4-84f8795ca57e"). InnerVolumeSpecName "kube-api-access-9djps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202072 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xhpvj"]
Feb 18 11:54:35 crc kubenswrapper[4922]: E0218 11:54:35.202440 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87604619-ec13-480d-9456-c5062685287d" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202471 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="87604619-ec13-480d-9456-c5062685287d" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: E0218 11:54:35.202487 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202496 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: E0218 11:54:35.202512 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202521 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: E0218 11:54:35.202535 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f413eca-d25a-4b47-82f6-e25088b65f2d" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202545 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f413eca-d25a-4b47-82f6-e25088b65f2d" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: E0218 11:54:35.202557 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b521417c-1968-49ee-8435-9e44af7e8a52" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202563 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b521417c-1968-49ee-8435-9e44af7e8a52" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: E0218 11:54:35.202574 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e55f2d-153c-47a0-95c4-84f8795ca57e" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202580 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e55f2d-153c-47a0-95c4-84f8795ca57e" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202721 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202734 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b521417c-1968-49ee-8435-9e44af7e8a52" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202751 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e55f2d-153c-47a0-95c4-84f8795ca57e" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202763 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="87604619-ec13-480d-9456-c5062685287d" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202772 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" containerName="mariadb-account-create-update"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.202781 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f413eca-d25a-4b47-82f6-e25088b65f2d" containerName="mariadb-database-create"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.203255 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.212350 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.216826 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xhpvj"]
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.245098 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.245156 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhhfx\" (UniqueName: \"kubernetes.io/projected/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f-kube-api-access-qhhfx\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.245173 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e55f2d-153c-47a0-95c4-84f8795ca57e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.245187 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9djps\" (UniqueName: \"kubernetes.io/projected/b9e55f2d-153c-47a0-95c4-84f8795ca57e-kube-api-access-9djps\") on node \"crc\" DevicePath \"\""
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.315134 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-59fc-account-create-update-29wvh"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.315159 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-59fc-account-create-update-29wvh" event={"ID":"76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb","Type":"ContainerDied","Data":"1d81707c5bd2ed4c306d7d2993e386735f9bc3f51595e0bf23abec0c6fba5f1a"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.316761 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d81707c5bd2ed4c306d7d2993e386735f9bc3f51595e0bf23abec0c6fba5f1a"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.317981 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvrlh" event={"ID":"46a88eb1-e58a-437e-b1eb-2dcb7e80b37f","Type":"ContainerDied","Data":"c03df322018b9138faee6254589c424bb6e088b4296997dd45244c942e4db7ab"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.318005 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03df322018b9138faee6254589c424bb6e088b4296997dd45244c942e4db7ab"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.318047 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mvrlh"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.322977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b89-account-create-update-x8q45" event={"ID":"b9e55f2d-153c-47a0-95c4-84f8795ca57e","Type":"ContainerDied","Data":"4d88076fc7a40e7ade149549d2f2ef5ee9743a2e42595b2835c326127fe730f0"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.323013 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d88076fc7a40e7ade149549d2f2ef5ee9743a2e42595b2835c326127fe730f0"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.323011 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b89-account-create-update-x8q45"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.328488 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"11f435f7a138ff88d942d18784f7daee9767063e1ca8c9d22468b614fb9184ae"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.328538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"ecf7f29fde1a718be3812f570f693693cf5186801e692c9023c577d084285ea0"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.330321 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czsfv" event={"ID":"3f413eca-d25a-4b47-82f6-e25088b65f2d","Type":"ContainerDied","Data":"284db3161b4e11bde939d2e242b4a6f4a60d3fc946f9e2e18054dbeb8b0bcecd"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.330347 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czsfv"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.330382 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="284db3161b4e11bde939d2e242b4a6f4a60d3fc946f9e2e18054dbeb8b0bcecd"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.331803 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-g95qz" event={"ID":"b521417c-1968-49ee-8435-9e44af7e8a52","Type":"ContainerDied","Data":"9b8999d1e83a0573e94d5cd921d708e3242c66fa9f1b39ddd8ee350a09e0dbf3"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.331835 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b8999d1e83a0573e94d5cd921d708e3242c66fa9f1b39ddd8ee350a09e0dbf3"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.331898 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-g95qz"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.335142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f0db-account-create-update-vwv99" event={"ID":"87604619-ec13-480d-9456-c5062685287d","Type":"ContainerDied","Data":"2bdf174ced138c4cd3a885471b1652700fd63cb3ebd9aa23d03d3a168ff65eb0"}
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.335176 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bdf174ced138c4cd3a885471b1652700fd63cb3ebd9aa23d03d3a168ff65eb0"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.335247 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f0db-account-create-update-vwv99"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.346279 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpsmc\" (UniqueName: \"kubernetes.io/projected/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-kube-api-access-rpsmc\") pod \"root-account-create-update-xhpvj\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.346356 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-operator-scripts\") pod \"root-account-create-update-xhpvj\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.448296 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpsmc\" (UniqueName: \"kubernetes.io/projected/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-kube-api-access-rpsmc\") pod \"root-account-create-update-xhpvj\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.448459 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-operator-scripts\") pod \"root-account-create-update-xhpvj\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.449524 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-operator-scripts\") pod \"root-account-create-update-xhpvj\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.469087 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpsmc\" (UniqueName: \"kubernetes.io/projected/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-kube-api-access-rpsmc\") pod \"root-account-create-update-xhpvj\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:35 crc kubenswrapper[4922]: I0218 11:54:35.530759 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xhpvj"
Feb 18 11:54:36 crc kubenswrapper[4922]: I0218 11:54:36.345181 4922 generic.go:334] "Generic (PLEG): container finished" podID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerID="66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8" exitCode=0
Feb 18 11:54:36 crc kubenswrapper[4922]: I0218 11:54:36.345329 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerDied","Data":"66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8"}
Feb 18 11:54:49 crc kubenswrapper[4922]: I0218 11:54:49.584288 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xhpvj"]
Feb 18 11:54:52 crc kubenswrapper[4922]: W0218 11:54:52.573601 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f2ad2ed_7e29_4760_aa9f_e5bf6bc96056.slice/crio-cda5ec3708d7b2f4b55f59d5b6c60d14db4ff2054fe6b3b308bcdd72c7da1a97 WatchSource:0}: Error finding container cda5ec3708d7b2f4b55f59d5b6c60d14db4ff2054fe6b3b308bcdd72c7da1a97: Status 404 returned error can't find the container with id cda5ec3708d7b2f4b55f59d5b6c60d14db4ff2054fe6b3b308bcdd72c7da1a97
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.205091 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified"
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.205282 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmckq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-st9pz_openstack(855fb3ec-e473-4a99-a94f-cc96dda6d9c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.206766 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-st9pz" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4"
Feb 18 11:54:53 crc kubenswrapper[4922]: I0218 11:54:53.207019 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.342296 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.158:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest"
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.342351 4922 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.158:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest"
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.342624 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.158:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxwsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-sznrv_openstack(5f835c05-4bbb-4678-9410-8523cf308f05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.343960 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-sznrv" podUID="5f835c05-4bbb-4678-9410-8523cf308f05"
Feb 18 11:54:53 crc kubenswrapper[4922]: I0218 11:54:53.494799 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xhpvj" event={"ID":"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056","Type":"ContainerStarted","Data":"cda5ec3708d7b2f4b55f59d5b6c60d14db4ff2054fe6b3b308bcdd72c7da1a97"}
Feb 18 11:54:53 crc kubenswrapper[4922]: E0218 11:54:53.497328 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.158:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-sznrv" podUID="5f835c05-4bbb-4678-9410-8523cf308f05"
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.505798 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerStarted","Data":"6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b"}
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.507604 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xhpvj" event={"ID":"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056","Type":"ContainerStarted","Data":"3b378847d4d22bec8f5e44e34e16024756ec3850a6ac282a1c69fb4c7cbed59f"}
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.509313 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fnkcj" event={"ID":"2102ef9b-8151-4edf-8b43-7c4486203911","Type":"ContainerStarted","Data":"96a538608703e5f575eecc4d7943dd23e9a2c36f206b74a346320d63b212ab98"}
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.513144 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"2d7ee715b5d934a60ccc4fa8fe636b8ae65649af013d76a0d7146d35621583f4"}
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.513167 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"3dcd45d219e5564a608d73b6a762cf85b1185176c70bb51d25995c0d9ffda06e"}
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.529093 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xhpvj" podStartSLOduration=19.529068552 podStartE2EDuration="19.529068552s" podCreationTimestamp="2026-02-18 11:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:54.524923287 +0000 UTC m=+1096.252627367" watchObservedRunningTime="2026-02-18 11:54:54.529068552 +0000 UTC m=+1096.256772632"
Feb 18 11:54:54 crc kubenswrapper[4922]: I0218 11:54:54.549166 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-fnkcj" podStartSLOduration=3.20674165 podStartE2EDuration="25.549148532s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="2026-02-18 11:54:30.878506427 +0000 UTC
m=+1072.606210507" lastFinishedPulling="2026-02-18 11:54:53.220913269 +0000 UTC m=+1094.948617389" observedRunningTime="2026-02-18 11:54:54.542924754 +0000 UTC m=+1096.270628834" watchObservedRunningTime="2026-02-18 11:54:54.549148532 +0000 UTC m=+1096.276852612" Feb 18 11:54:55 crc kubenswrapper[4922]: I0218 11:54:55.523289 4922 generic.go:334] "Generic (PLEG): container finished" podID="3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" containerID="3b378847d4d22bec8f5e44e34e16024756ec3850a6ac282a1c69fb4c7cbed59f" exitCode=0 Feb 18 11:54:55 crc kubenswrapper[4922]: I0218 11:54:55.523609 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xhpvj" event={"ID":"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056","Type":"ContainerDied","Data":"3b378847d4d22bec8f5e44e34e16024756ec3850a6ac282a1c69fb4c7cbed59f"} Feb 18 11:54:55 crc kubenswrapper[4922]: I0218 11:54:55.529725 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"702a0b64ae0a6f1df760160dcae295326177206b6911d0b37de4d77ff3643292"} Feb 18 11:54:55 crc kubenswrapper[4922]: I0218 11:54:55.529807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"3f83a7028420b34f585a6a85938c72318f7af775dde1cb5b8b3d7caac762d7fd"} Feb 18 11:54:56 crc kubenswrapper[4922]: I0218 11:54:56.844194 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xhpvj" Feb 18 11:54:56 crc kubenswrapper[4922]: I0218 11:54:56.975540 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpsmc\" (UniqueName: \"kubernetes.io/projected/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-kube-api-access-rpsmc\") pod \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " Feb 18 11:54:56 crc kubenswrapper[4922]: I0218 11:54:56.975630 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-operator-scripts\") pod \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\" (UID: \"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056\") " Feb 18 11:54:56 crc kubenswrapper[4922]: I0218 11:54:56.976163 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" (UID: "3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:56 crc kubenswrapper[4922]: I0218 11:54:56.976888 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:56 crc kubenswrapper[4922]: I0218 11:54:56.984566 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-kube-api-access-rpsmc" (OuterVolumeSpecName: "kube-api-access-rpsmc") pod "3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" (UID: "3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056"). InnerVolumeSpecName "kube-api-access-rpsmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.080671 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpsmc\" (UniqueName: \"kubernetes.io/projected/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056-kube-api-access-rpsmc\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.551829 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerStarted","Data":"a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b"} Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.552123 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerStarted","Data":"491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c"} Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.553164 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xhpvj" event={"ID":"3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056","Type":"ContainerDied","Data":"cda5ec3708d7b2f4b55f59d5b6c60d14db4ff2054fe6b3b308bcdd72c7da1a97"} Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.553191 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cda5ec3708d7b2f4b55f59d5b6c60d14db4ff2054fe6b3b308bcdd72c7da1a97" Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.553252 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xhpvj" Feb 18 11:54:57 crc kubenswrapper[4922]: I0218 11:54:57.589145 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=32.58912237 podStartE2EDuration="32.58912237s" podCreationTimestamp="2026-02-18 11:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:57.588687069 +0000 UTC m=+1099.316391169" watchObservedRunningTime="2026-02-18 11:54:57.58912237 +0000 UTC m=+1099.316826450" Feb 18 11:54:58 crc kubenswrapper[4922]: I0218 11:54:58.569245 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"8683da66dc56b9b992d4f71e8d035f8d162e0768fc93b7f6f278e9c7a3505316"} Feb 18 11:54:58 crc kubenswrapper[4922]: I0218 11:54:58.570307 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"409439d3613775a0a0cfc3cd788d4d9d8790e06d10bbfd2d942990950e52afee"} Feb 18 11:54:59 crc kubenswrapper[4922]: I0218 11:54:59.592780 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"defe5ce8a73e6664263abcc81c3e02689b14d12d9d5d4caf56ccaebce514415c"} Feb 18 11:54:59 crc kubenswrapper[4922]: I0218 11:54:59.593117 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"cf5b0c68a4a569fd7722160f29c535e4de61e15cc54c9dc86a3cb77f9958e9ab"} Feb 18 11:54:59 crc kubenswrapper[4922]: I0218 11:54:59.593129 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"7a73f4521bddd4bc67844b95a569e63aa1de2a3491f087efb5f6e1bc4176cf13"} Feb 18 11:55:00 crc kubenswrapper[4922]: I0218 11:55:00.835987 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 18 11:55:05 crc kubenswrapper[4922]: E0218 11:55:05.974075 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-st9pz" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" Feb 18 11:55:07 crc kubenswrapper[4922]: I0218 11:55:07.666861 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"dabd0d755dd7bd6a2c1225baf5ee9fb272de42f7707117b64cebceea782eb8a5"} Feb 18 11:55:08 crc kubenswrapper[4922]: I0218 11:55:08.680120 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0771bdc1-7622-4a65-aa82-3150630ce652","Type":"ContainerStarted","Data":"7d6ea9e74abfefdd59a5398765c2930b709aecca0131c57ee9ddb2139819fc4b"} Feb 18 11:55:08 crc kubenswrapper[4922]: I0218 11:55:08.686269 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sznrv" event={"ID":"5f835c05-4bbb-4678-9410-8523cf308f05","Type":"ContainerStarted","Data":"7b6ad8f3e92b6bbfd943139b3553e32910b71922866701c9884de500a3eaebeb"} Feb 18 11:55:08 crc kubenswrapper[4922]: I0218 11:55:08.737973 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.504890019 podStartE2EDuration="1m14.737954333s" podCreationTimestamp="2026-02-18 11:53:54 +0000 UTC" firstStartedPulling="2026-02-18 11:54:28.3676229 +0000 UTC m=+1070.095326980" 
lastFinishedPulling="2026-02-18 11:54:57.600687214 +0000 UTC m=+1099.328391294" observedRunningTime="2026-02-18 11:55:08.732334911 +0000 UTC m=+1110.460038991" watchObservedRunningTime="2026-02-18 11:55:08.737954333 +0000 UTC m=+1110.465658413" Feb 18 11:55:08 crc kubenswrapper[4922]: I0218 11:55:08.774875 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-sznrv" podStartSLOduration=2.26859052 podStartE2EDuration="39.774849496s" podCreationTimestamp="2026-02-18 11:54:29 +0000 UTC" firstStartedPulling="2026-02-18 11:54:30.475470208 +0000 UTC m=+1072.203174288" lastFinishedPulling="2026-02-18 11:55:07.981729184 +0000 UTC m=+1109.709433264" observedRunningTime="2026-02-18 11:55:08.758199545 +0000 UTC m=+1110.485903635" watchObservedRunningTime="2026-02-18 11:55:08.774849496 +0000 UTC m=+1110.502553576" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.048264 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-zpx62"] Feb 18 11:55:09 crc kubenswrapper[4922]: E0218 11:55:09.048851 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" containerName="mariadb-account-create-update" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.048870 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" containerName="mariadb-account-create-update" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.049066 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" containerName="mariadb-account-create-update" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.049982 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.052809 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.067433 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-zpx62"] Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.153915 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.154005 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.154050 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn24f\" (UniqueName: \"kubernetes.io/projected/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-kube-api-access-xn24f\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.154075 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: 
\"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.154099 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.154120 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-config\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.255966 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.256077 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.256125 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn24f\" (UniqueName: \"kubernetes.io/projected/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-kube-api-access-xn24f\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: 
\"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.256151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.256185 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.256215 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-config\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.257267 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.257270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.257419 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.257696 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-config\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.257772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.276150 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn24f\" (UniqueName: \"kubernetes.io/projected/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-kube-api-access-xn24f\") pod \"dnsmasq-dns-5c79d794d7-zpx62\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.368524 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:09 crc kubenswrapper[4922]: I0218 11:55:09.846484 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-zpx62"] Feb 18 11:55:10 crc kubenswrapper[4922]: I0218 11:55:10.705953 4922 generic.go:334] "Generic (PLEG): container finished" podID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerID="de3eb0b81c2ecf51916540dbcf6e765b8396d655b971b5b5ee803c68d62a7d58" exitCode=0 Feb 18 11:55:10 crc kubenswrapper[4922]: I0218 11:55:10.706112 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" event={"ID":"df0f7ce3-64d4-45c3-b416-58e49b5b5bac","Type":"ContainerDied","Data":"de3eb0b81c2ecf51916540dbcf6e765b8396d655b971b5b5ee803c68d62a7d58"} Feb 18 11:55:10 crc kubenswrapper[4922]: I0218 11:55:10.706309 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" event={"ID":"df0f7ce3-64d4-45c3-b416-58e49b5b5bac","Type":"ContainerStarted","Data":"d6fd96a63f0eb8def4408247acfb363976fe91401640a5a1bfb2074f5c37e39c"} Feb 18 11:55:10 crc kubenswrapper[4922]: I0218 11:55:10.835892 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 18 11:55:10 crc kubenswrapper[4922]: I0218 11:55:10.844110 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 18 11:55:11 crc kubenswrapper[4922]: I0218 11:55:11.716547 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" event={"ID":"df0f7ce3-64d4-45c3-b416-58e49b5b5bac","Type":"ContainerStarted","Data":"b46ddfc217eb56701f6d0cade709faba90adfb7a9e5d395fb4ad4edb78eeacbf"} Feb 18 11:55:11 crc kubenswrapper[4922]: I0218 11:55:11.716882 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:11 crc 
kubenswrapper[4922]: I0218 11:55:11.725516 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 18 11:55:11 crc kubenswrapper[4922]: I0218 11:55:11.747969 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" podStartSLOduration=2.747951703 podStartE2EDuration="2.747951703s" podCreationTimestamp="2026-02-18 11:55:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:55:11.742181657 +0000 UTC m=+1113.469885777" watchObservedRunningTime="2026-02-18 11:55:11.747951703 +0000 UTC m=+1113.475655783" Feb 18 11:55:16 crc kubenswrapper[4922]: I0218 11:55:16.768030 4922 generic.go:334] "Generic (PLEG): container finished" podID="5f835c05-4bbb-4678-9410-8523cf308f05" containerID="7b6ad8f3e92b6bbfd943139b3553e32910b71922866701c9884de500a3eaebeb" exitCode=0 Feb 18 11:55:16 crc kubenswrapper[4922]: I0218 11:55:16.768258 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sznrv" event={"ID":"5f835c05-4bbb-4678-9410-8523cf308f05","Type":"ContainerDied","Data":"7b6ad8f3e92b6bbfd943139b3553e32910b71922866701c9884de500a3eaebeb"} Feb 18 11:55:16 crc kubenswrapper[4922]: I0218 11:55:16.772187 4922 generic.go:334] "Generic (PLEG): container finished" podID="2102ef9b-8151-4edf-8b43-7c4486203911" containerID="96a538608703e5f575eecc4d7943dd23e9a2c36f206b74a346320d63b212ab98" exitCode=0 Feb 18 11:55:16 crc kubenswrapper[4922]: I0218 11:55:16.772228 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fnkcj" event={"ID":"2102ef9b-8151-4edf-8b43-7c4486203911","Type":"ContainerDied","Data":"96a538608703e5f575eecc4d7943dd23e9a2c36f206b74a346320d63b212ab98"} Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.227638 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-sznrv" Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.236817 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fnkcj" Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.326865 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxwsh\" (UniqueName: \"kubernetes.io/projected/5f835c05-4bbb-4678-9410-8523cf308f05-kube-api-access-cxwsh\") pod \"5f835c05-4bbb-4678-9410-8523cf308f05\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.326962 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-combined-ca-bundle\") pod \"2102ef9b-8151-4edf-8b43-7c4486203911\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.327017 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-combined-ca-bundle\") pod \"5f835c05-4bbb-4678-9410-8523cf308f05\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.327099 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-db-sync-config-data\") pod \"5f835c05-4bbb-4678-9410-8523cf308f05\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.327127 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsps6\" (UniqueName: \"kubernetes.io/projected/2102ef9b-8151-4edf-8b43-7c4486203911-kube-api-access-bsps6\") pod 
\"2102ef9b-8151-4edf-8b43-7c4486203911\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.327983 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-config-data\") pod \"5f835c05-4bbb-4678-9410-8523cf308f05\" (UID: \"5f835c05-4bbb-4678-9410-8523cf308f05\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.328015 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-config-data\") pod \"2102ef9b-8151-4edf-8b43-7c4486203911\" (UID: \"2102ef9b-8151-4edf-8b43-7c4486203911\") " Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.332882 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2102ef9b-8151-4edf-8b43-7c4486203911-kube-api-access-bsps6" (OuterVolumeSpecName: "kube-api-access-bsps6") pod "2102ef9b-8151-4edf-8b43-7c4486203911" (UID: "2102ef9b-8151-4edf-8b43-7c4486203911"). InnerVolumeSpecName "kube-api-access-bsps6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.335137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5f835c05-4bbb-4678-9410-8523cf308f05" (UID: "5f835c05-4bbb-4678-9410-8523cf308f05"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.335428 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f835c05-4bbb-4678-9410-8523cf308f05-kube-api-access-cxwsh" (OuterVolumeSpecName: "kube-api-access-cxwsh") pod "5f835c05-4bbb-4678-9410-8523cf308f05" (UID: "5f835c05-4bbb-4678-9410-8523cf308f05"). InnerVolumeSpecName "kube-api-access-cxwsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.350488 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f835c05-4bbb-4678-9410-8523cf308f05" (UID: "5f835c05-4bbb-4678-9410-8523cf308f05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.360515 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2102ef9b-8151-4edf-8b43-7c4486203911" (UID: "2102ef9b-8151-4edf-8b43-7c4486203911"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.374881 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-config-data" (OuterVolumeSpecName: "config-data") pod "2102ef9b-8151-4edf-8b43-7c4486203911" (UID: "2102ef9b-8151-4edf-8b43-7c4486203911"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.385492 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-config-data" (OuterVolumeSpecName: "config-data") pod "5f835c05-4bbb-4678-9410-8523cf308f05" (UID: "5f835c05-4bbb-4678-9410-8523cf308f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429536 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429578 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429591 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxwsh\" (UniqueName: \"kubernetes.io/projected/5f835c05-4bbb-4678-9410-8523cf308f05-kube-api-access-cxwsh\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429604 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2102ef9b-8151-4edf-8b43-7c4486203911-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429615 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429626 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f835c05-4bbb-4678-9410-8523cf308f05-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.429639 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsps6\" (UniqueName: \"kubernetes.io/projected/2102ef9b-8151-4edf-8b43-7c4486203911-kube-api-access-bsps6\") on node \"crc\" DevicePath \"\""
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.788958 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-sznrv"
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.788956 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-sznrv" event={"ID":"5f835c05-4bbb-4678-9410-8523cf308f05","Type":"ContainerDied","Data":"64fe53faccd6664eee091113f52c4e7d0ab540979e620f3862763ef2ec700f6d"}
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.789647 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64fe53faccd6664eee091113f52c4e7d0ab540979e620f3862763ef2ec700f6d"
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.790528 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fnkcj" event={"ID":"2102ef9b-8151-4edf-8b43-7c4486203911","Type":"ContainerDied","Data":"e7cf1c0a3ad45fa91a71f51e0bf0901154e6b84ba139717db0e095e711718562"}
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.790563 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7cf1c0a3ad45fa91a71f51e0bf0901154e6b84ba139717db0e095e711718562"
Feb 18 11:55:18 crc kubenswrapper[4922]: I0218 11:55:18.790590 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fnkcj"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.063515 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-zpx62"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.063777 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="dnsmasq-dns" containerID="cri-o://b46ddfc217eb56701f6d0cade709faba90adfb7a9e5d395fb4ad4edb78eeacbf" gracePeriod=10
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.066542 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.101321 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-bms2q"]
Feb 18 11:55:19 crc kubenswrapper[4922]: E0218 11:55:19.102297 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2102ef9b-8151-4edf-8b43-7c4486203911" containerName="keystone-db-sync"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.102315 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2102ef9b-8151-4edf-8b43-7c4486203911" containerName="keystone-db-sync"
Feb 18 11:55:19 crc kubenswrapper[4922]: E0218 11:55:19.102331 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f835c05-4bbb-4678-9410-8523cf308f05" containerName="watcher-db-sync"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.102336 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f835c05-4bbb-4678-9410-8523cf308f05" containerName="watcher-db-sync"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.102504 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2102ef9b-8151-4edf-8b43-7c4486203911" containerName="keystone-db-sync"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.102517 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f835c05-4bbb-4678-9410-8523cf308f05" containerName="watcher-db-sync"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.103532 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.143293 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.143390 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.143428 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mgs\" (UniqueName: \"kubernetes.io/projected/080a4dbf-a721-4b07-8c48-1ed03637a871-kube-api-access-85mgs\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.143503 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-config\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.143534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.143554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-svc\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.176314 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mqkkx"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.177407 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.194873 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.195070 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.195205 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.195405 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.195523 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8rn4"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.221392 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-bms2q"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247542 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-config\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247620 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-svc\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247668 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-config-data\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-fernet-keys\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-combined-ca-bundle\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247769 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8x7\" (UniqueName: \"kubernetes.io/projected/7598fc1c-8735-4c0e-a095-f13117d3037e-kube-api-access-kj8x7\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247852 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-scripts\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247867 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mgs\" (UniqueName: \"kubernetes.io/projected/080a4dbf-a721-4b07-8c48-1ed03637a871-kube-api-access-85mgs\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.247888 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-credential-keys\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.250856 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.250907 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.251491 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.252387 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-svc\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.265905 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mqkkx"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.266897 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-config\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.325283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mgs\" (UniqueName: \"kubernetes.io/projected/080a4dbf-a721-4b07-8c48-1ed03637a871-kube-api-access-85mgs\") pod \"dnsmasq-dns-5b868669f-bms2q\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.362694 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-config-data\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.362765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-fernet-keys\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.362863 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-combined-ca-bundle\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.362996 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8x7\" (UniqueName: \"kubernetes.io/projected/7598fc1c-8735-4c0e-a095-f13117d3037e-kube-api-access-kj8x7\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.363072 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-scripts\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.363120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-credential-keys\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.386693 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.391398 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-combined-ca-bundle\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.422190 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-credential-keys\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.423085 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-fernet-keys\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.423427 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-scripts\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.466919 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-config-data\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.467700 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.468926 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.470303 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8x7\" (UniqueName: \"kubernetes.io/projected/7598fc1c-8735-4c0e-a095-f13117d3037e-kube-api-access-kj8x7\") pod \"keystone-bootstrap-mqkkx\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.470978 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-82jsz"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.471186 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.492364 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.561746 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-bms2q"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.562440 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.563523 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.566725 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.619613 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.626459 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.634802 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zjb6x"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.638766 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.645033 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.660351 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zjb6x"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.660442 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.660533 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zjb6x"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.672405 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.672440 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-569dfc4865-ndwdj"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.672500 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bmh7l"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.672589 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.674291 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-569dfc4865-ndwdj"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.675935 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-logs\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.675986 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676041 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-config-data\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676072 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfjr\" (UniqueName: \"kubernetes.io/projected/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-kube-api-access-dlfjr\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676099 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lkg5\" (UniqueName: \"kubernetes.io/projected/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-kube-api-access-6lkg5\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676181 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676278 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.676503 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-logs\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.681744 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mqkkx"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.691654 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.691988 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-lqkqv"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.692182 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.692279 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.692379 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-569dfc4865-ndwdj"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.757829 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-clz29"]
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.758942 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-clz29"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.775817 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.776047 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pnzs4"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.776189 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.777843 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.777897 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-config-data\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.777924 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-logs\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.777953 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfjr\" (UniqueName: \"kubernetes.io/projected/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-kube-api-access-dlfjr\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.777975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lkg5\" (UniqueName: \"kubernetes.io/projected/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-kube-api-access-6lkg5\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778003 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-scripts\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778026 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-config-data\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778046 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6bkw\" (UniqueName: \"kubernetes.io/projected/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-kube-api-access-k6bkw\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778082 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778126 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778153 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmgk\" (UniqueName: \"kubernetes.io/projected/4bcd3608-244b-44f0-be1f-5d953cd35964-kube-api-access-kvmgk\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778184 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-config-data\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778464 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxnl\" (UniqueName: \"kubernetes.io/projected/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-kube-api-access-lbxnl\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778638 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778760 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-logs\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-combined-ca-bundle\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-logs\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778887 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-logs\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0"
Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778913 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778934 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-config\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.778967 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-horizon-secret-key\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.792056 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-logs\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.792139 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.792546 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-logs\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.803264 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.810086 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.811585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.815478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.818120 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-config-data\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.822423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.823462 4922 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-db-sync-rvpx7"] Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.824578 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.838842 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.839035 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xpg9l" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.840994 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.841140 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.841958 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfjr\" (UniqueName: \"kubernetes.io/projected/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-kube-api-access-dlfjr\") pod \"watcher-applier-0\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " pod="openstack/watcher-applier-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.842121 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lkg5\" (UniqueName: \"kubernetes.io/projected/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-kube-api-access-6lkg5\") pod \"watcher-decision-engine-0\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.848993 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.854905 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:55:19 crc 
kubenswrapper[4922]: I0218 11:55:19.882846 4922 generic.go:334] "Generic (PLEG): container finished" podID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerID="b46ddfc217eb56701f6d0cade709faba90adfb7a9e5d395fb4ad4edb78eeacbf" exitCode=0 Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.882902 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" event={"ID":"df0f7ce3-64d4-45c3-b416-58e49b5b5bac","Type":"ContainerDied","Data":"b46ddfc217eb56701f6d0cade709faba90adfb7a9e5d395fb4ad4edb78eeacbf"} Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883315 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883352 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883408 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-logs\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883429 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-run-httpd\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " 
pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883450 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-logs\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883471 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-scripts\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-config-data\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883508 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d8ft\" (UniqueName: \"kubernetes.io/projected/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-kube-api-access-5d8ft\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883536 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6bkw\" (UniqueName: \"kubernetes.io/projected/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-kube-api-access-k6bkw\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883589 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmgk\" (UniqueName: \"kubernetes.io/projected/4bcd3608-244b-44f0-be1f-5d953cd35964-kube-api-access-kvmgk\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883619 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-scripts\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883643 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-config-data\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883677 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-config-data\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883707 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxnl\" (UniqueName: \"kubernetes.io/projected/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-kube-api-access-lbxnl\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883731 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883767 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-config-data\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883788 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmwrr\" (UniqueName: \"kubernetes.io/projected/a3873c9e-308f-46ea-ac8f-4ee78ca92235-kube-api-access-jmwrr\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-scripts\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-combined-ca-bundle\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883884 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-log-httpd\") pod 
\"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883916 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-logs\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883954 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-combined-ca-bundle\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.883972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-config\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.884003 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-horizon-secret-key\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 
crc kubenswrapper[4922]: I0218 11:55:19.884176 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-logs\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.884953 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-scripts\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.887496 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-config-data\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.888638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-logs\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.890817 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.909428 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-bms2q"] Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.925051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.925532 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-horizon-secret-key\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.946336 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-config\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.948447 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-clz29"] Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.949109 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.949562 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-config-data\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.956199 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmgk\" (UniqueName: \"kubernetes.io/projected/4bcd3608-244b-44f0-be1f-5d953cd35964-kube-api-access-kvmgk\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.956634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxnl\" (UniqueName: \"kubernetes.io/projected/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-kube-api-access-lbxnl\") pod \"horizon-569dfc4865-ndwdj\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.961573 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mqfhx"] Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.962855 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.964302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-combined-ca-bundle\") pod \"neutron-db-sync-zjb6x\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") " pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.965138 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6bkw\" (UniqueName: \"kubernetes.io/projected/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-kube-api-access-k6bkw\") pod \"watcher-api-0\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " pod="openstack/watcher-api-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.984112 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rvpx7"] Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.984936 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-run-httpd\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.984979 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-logs\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985002 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d8ft\" (UniqueName: \"kubernetes.io/projected/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-kube-api-access-5d8ft\") pod \"placement-db-sync-clz29\" (UID: 
\"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985034 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-db-sync-config-data\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985071 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8nh\" (UniqueName: \"kubernetes.io/projected/d7852f85-b8c5-458e-901c-3659c5ed2713-kube-api-access-4h8nh\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985089 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-scripts\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985107 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7852f85-b8c5-458e-901c-3659c5ed2713-etc-machine-id\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985135 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-config-data\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " 
pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985167 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-config-data\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmwrr\" (UniqueName: \"kubernetes.io/projected/a3873c9e-308f-46ea-ac8f-4ee78ca92235-kube-api-access-jmwrr\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985206 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-scripts\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985225 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-scripts\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985249 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-combined-ca-bundle\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985273 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-log-httpd\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985298 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985315 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-combined-ca-bundle\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985379 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-config-data\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.985810 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.986050 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-logs\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:19 crc kubenswrapper[4922]: I0218 11:55:19.991391 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-log-httpd\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.003245 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.003439 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pktbk" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.004039 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.019336 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-scripts\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.023234 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mqfhx"] Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.026265 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zjb6x" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.044863 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-2ktqs"] Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.046201 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.055428 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-97df67fc7-qxhz9"] Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.057116 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.078665 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.087931 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-725mj\" (UniqueName: \"kubernetes.io/projected/31aad152-dcb7-472f-a0f8-d90ae972442b-kube-api-access-725mj\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.087971 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-config-data\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088019 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-db-sync-config-data\") pod \"cinder-db-sync-rvpx7\" (UID: 
\"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088039 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-db-sync-config-data\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088077 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8nh\" (UniqueName: \"kubernetes.io/projected/d7852f85-b8c5-458e-901c-3659c5ed2713-kube-api-access-4h8nh\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088098 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7852f85-b8c5-458e-901c-3659c5ed2713-etc-machine-id\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088114 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-combined-ca-bundle\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088186 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-scripts\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " 
pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088210 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-combined-ca-bundle\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.088937 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7852f85-b8c5-458e-901c-3659c5ed2713-etc-machine-id\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.092679 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-2ktqs"] Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.115052 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.117196 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-combined-ca-bundle\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.123572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " 
pod="openstack/ceilometer-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.124571 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-config-data\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.124895 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-scripts\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.130149 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.134573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-config-data\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.138211 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d8ft\" (UniqueName: \"kubernetes.io/projected/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-kube-api-access-5d8ft\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.151207 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-combined-ca-bundle\") pod \"placement-db-sync-clz29\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " pod="openstack/placement-db-sync-clz29" 
Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.168874 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-scripts\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.169721 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmwrr\" (UniqueName: \"kubernetes.io/projected/a3873c9e-308f-46ea-ac8f-4ee78ca92235-kube-api-access-jmwrr\") pod \"ceilometer-0\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " pod="openstack/ceilometer-0" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.192278 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-db-sync-config-data\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.192332 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-config-data\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.193718 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-config-data\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.193823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20b58cdc-a36d-4a63-b86d-474dac5d4566-horizon-secret-key\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.193887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-scripts\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.193924 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.193949 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24jm\" (UniqueName: \"kubernetes.io/projected/20b58cdc-a36d-4a63-b86d-474dac5d4566-kube-api-access-p24jm\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194001 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194094 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-db-sync-config-data\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194173 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-config\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194227 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-combined-ca-bundle\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194302 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-97df67fc7-qxhz9"] Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194349 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b58cdc-a36d-4a63-b86d-474dac5d4566-logs\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194585 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc 
kubenswrapper[4922]: I0218 11:55:20.194666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kkt\" (UniqueName: \"kubernetes.io/projected/24bbb94b-821e-4c8c-ae27-356f296903bf-kube-api-access-x8kkt\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194716 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-svc\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.194773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-725mj\" (UniqueName: \"kubernetes.io/projected/31aad152-dcb7-472f-a0f8-d90ae972442b-kube-api-access-725mj\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.223011 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-combined-ca-bundle\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.223461 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-db-sync-config-data\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.226107 
4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-725mj\" (UniqueName: \"kubernetes.io/projected/31aad152-dcb7-472f-a0f8-d90ae972442b-kube-api-access-725mj\") pod \"barbican-db-sync-mqfhx\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") " pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.226329 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8nh\" (UniqueName: \"kubernetes.io/projected/d7852f85-b8c5-458e-901c-3659c5ed2713-kube-api-access-4h8nh\") pod \"cinder-db-sync-rvpx7\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.310559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b58cdc-a36d-4a63-b86d-474dac5d4566-logs\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.312198 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.312399 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kkt\" (UniqueName: \"kubernetes.io/projected/24bbb94b-821e-4c8c-ae27-356f296903bf-kube-api-access-x8kkt\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.312565 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-svc\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.312734 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-config-data\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.312987 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20b58cdc-a36d-4a63-b86d-474dac5d4566-horizon-secret-key\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.313145 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-scripts\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.313261 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.313352 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p24jm\" (UniqueName: 
\"kubernetes.io/projected/20b58cdc-a36d-4a63-b86d-474dac5d4566-kube-api-access-p24jm\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.313476 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.313620 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-config\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.314189 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-svc\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.314230 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.311532 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b58cdc-a36d-4a63-b86d-474dac5d4566-logs\") pod \"horizon-97df67fc7-qxhz9\" (UID: 
\"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.315614 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.315776 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-scripts\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.315999 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-config-data\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.317164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.320772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-config\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.331758 
4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20b58cdc-a36d-4a63-b86d-474dac5d4566-horizon-secret-key\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.361892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24jm\" (UniqueName: \"kubernetes.io/projected/20b58cdc-a36d-4a63-b86d-474dac5d4566-kube-api-access-p24jm\") pod \"horizon-97df67fc7-qxhz9\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:20 crc kubenswrapper[4922]: I0218 11:55:20.362999 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kkt\" (UniqueName: \"kubernetes.io/projected/24bbb94b-821e-4c8c-ae27-356f296903bf-kube-api-access-x8kkt\") pod \"dnsmasq-dns-cf78879c9-2ktqs\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.411691 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-clz29" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.464241 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.473819 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.658270 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mqfhx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.679296 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.688431 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:20.895919 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-st9pz" event={"ID":"855fb3ec-e473-4a99-a94f-cc96dda6d9c4","Type":"ContainerStarted","Data":"f6b3d065bfd75b2cfcd638eb81caf2052d3e13f6ce152807bf20c4d48f24622e"} Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.494112 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.562434 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-569dfc4865-ndwdj"] Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.637459 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b45644857-ghjwx"] Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.638869 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.673324 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b45644857-ghjwx"] Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.741528 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-scripts\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.741608 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-config-data\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.741647 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ff9b70-4f7e-43b8-b270-3470a18fcbda-logs\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.741666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmpjz\" (UniqueName: \"kubernetes.io/projected/92ff9b70-4f7e-43b8-b270-3470a18fcbda-kube-api-access-gmpjz\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.741728 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/92ff9b70-4f7e-43b8-b270-3470a18fcbda-horizon-secret-key\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.844002 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ff9b70-4f7e-43b8-b270-3470a18fcbda-logs\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.844072 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmpjz\" (UniqueName: \"kubernetes.io/projected/92ff9b70-4f7e-43b8-b270-3470a18fcbda-kube-api-access-gmpjz\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.844198 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ff9b70-4f7e-43b8-b270-3470a18fcbda-horizon-secret-key\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.844273 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-scripts\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.844804 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-config-data\") pod \"horizon-5b45644857-ghjwx\" 
(UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.845074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ff9b70-4f7e-43b8-b270-3470a18fcbda-logs\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.845428 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-scripts\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.846089 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-config-data\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.851015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ff9b70-4f7e-43b8-b270-3470a18fcbda-horizon-secret-key\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.873135 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmpjz\" (UniqueName: \"kubernetes.io/projected/92ff9b70-4f7e-43b8-b270-3470a18fcbda-kube-api-access-gmpjz\") pod \"horizon-5b45644857-ghjwx\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:21.958028 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.790223 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-st9pz" podStartSLOduration=6.351904296 podStartE2EDuration="1m15.790202329s" podCreationTimestamp="2026-02-18 11:54:07 +0000 UTC" firstStartedPulling="2026-02-18 11:54:09.401080259 +0000 UTC m=+1051.128784339" lastFinishedPulling="2026-02-18 11:55:18.839378292 +0000 UTC m=+1120.567082372" observedRunningTime="2026-02-18 11:55:21.953404872 +0000 UTC m=+1123.681108952" watchObservedRunningTime="2026-02-18 11:55:22.790202329 +0000 UTC m=+1124.517906399" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.806633 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.902917 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.938466 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" event={"ID":"df0f7ce3-64d4-45c3-b416-58e49b5b5bac","Type":"ContainerDied","Data":"d6fd96a63f0eb8def4408247acfb363976fe91401640a5a1bfb2074f5c37e39c"} Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.938537 4922 scope.go:117] "RemoveContainer" containerID="b46ddfc217eb56701f6d0cade709faba90adfb7a9e5d395fb4ad4edb78eeacbf" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.938546 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-zpx62" Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.973093 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-swift-storage-0\") pod \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.973242 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn24f\" (UniqueName: \"kubernetes.io/projected/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-kube-api-access-xn24f\") pod \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.973283 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-config\") pod \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.973351 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-svc\") pod \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.973450 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-sb\") pod \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.973537 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-nb\") pod \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\" (UID: \"df0f7ce3-64d4-45c3-b416-58e49b5b5bac\") " Feb 18 11:55:22 crc kubenswrapper[4922]: I0218 11:55:22.991772 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-kube-api-access-xn24f" (OuterVolumeSpecName: "kube-api-access-xn24f") pod "df0f7ce3-64d4-45c3-b416-58e49b5b5bac" (UID: "df0f7ce3-64d4-45c3-b416-58e49b5b5bac"). InnerVolumeSpecName "kube-api-access-xn24f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.033683 4922 scope.go:117] "RemoveContainer" containerID="de3eb0b81c2ecf51916540dbcf6e765b8396d655b971b5b5ee803c68d62a7d58" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.080075 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn24f\" (UniqueName: \"kubernetes.io/projected/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-kube-api-access-xn24f\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:23 crc kubenswrapper[4922]: W0218 11:55:23.090673 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod080a4dbf_a721_4b07_8c48_1ed03637a871.slice/crio-953a10f63b24926b9e9c17b705d09c9065c66847ee05a8c88823e22fb95fe0b7 WatchSource:0}: Error finding container 953a10f63b24926b9e9c17b705d09c9065c66847ee05a8c88823e22fb95fe0b7: Status 404 returned error can't find the container with id 953a10f63b24926b9e9c17b705d09c9065c66847ee05a8c88823e22fb95fe0b7 Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.103333 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df0f7ce3-64d4-45c3-b416-58e49b5b5bac" (UID: "df0f7ce3-64d4-45c3-b416-58e49b5b5bac"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.114880 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df0f7ce3-64d4-45c3-b416-58e49b5b5bac" (UID: "df0f7ce3-64d4-45c3-b416-58e49b5b5bac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.120749 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df0f7ce3-64d4-45c3-b416-58e49b5b5bac" (UID: "df0f7ce3-64d4-45c3-b416-58e49b5b5bac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.128196 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df0f7ce3-64d4-45c3-b416-58e49b5b5bac" (UID: "df0f7ce3-64d4-45c3-b416-58e49b5b5bac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.128404 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-config" (OuterVolumeSpecName: "config") pod "df0f7ce3-64d4-45c3-b416-58e49b5b5bac" (UID: "df0f7ce3-64d4-45c3-b416-58e49b5b5bac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.140602 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-bms2q"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.182022 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.182071 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.182088 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.182098 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.182106 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0f7ce3-64d4-45c3-b416-58e49b5b5bac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.202723 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:55:23 crc kubenswrapper[4922]: W0218 11:55:23.207553 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e9fc515_5e15_41fc_8e76_b9f3af099a0f.slice/crio-089198a21158f6f6b56f6630ad31cc5105f4b0001f6d2937e2ed72533d2aaec8 WatchSource:0}: Error finding container 089198a21158f6f6b56f6630ad31cc5105f4b0001f6d2937e2ed72533d2aaec8: Status 404 returned error can't find the container with id 089198a21158f6f6b56f6630ad31cc5105f4b0001f6d2937e2ed72533d2aaec8 Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.218657 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.229278 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mqkkx"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.313187 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-zpx62"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.335863 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-zpx62"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.520475 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-clz29"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.532311 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zjb6x"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.558735 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-97df67fc7-qxhz9"] Feb 18 11:55:23 crc kubenswrapper[4922]: W0218 11:55:23.585160 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31aad152_dcb7_472f_a0f8_d90ae972442b.slice/crio-9b5d6cd64e7bf271ae9d437b3dc8665c7fa56f4097bd261a86583c671e88f922 WatchSource:0}: Error finding container 9b5d6cd64e7bf271ae9d437b3dc8665c7fa56f4097bd261a86583c671e88f922: Status 404 returned error can't find the container with 
id 9b5d6cd64e7bf271ae9d437b3dc8665c7fa56f4097bd261a86583c671e88f922 Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.594808 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mqfhx"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.604460 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rvpx7"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.613749 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b45644857-ghjwx"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.622977 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-569dfc4865-ndwdj"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.631872 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.727052 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-2ktqs"] Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.737875 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 18 11:55:23 crc kubenswrapper[4922]: W0218 11:55:23.760204 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c614b6a_8d46_4a07_89f6_1a7cc64dfdad.slice/crio-c1680a690b411a1b27ee20c720d7966628a4b267d54560c222bee5a47c521f74 WatchSource:0}: Error finding container c1680a690b411a1b27ee20c720d7966628a4b267d54560c222bee5a47c521f74: Status 404 returned error can't find the container with id c1680a690b411a1b27ee20c720d7966628a4b267d54560c222bee5a47c521f74 Feb 18 11:55:23 crc kubenswrapper[4922]: W0218 11:55:23.762374 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24bbb94b_821e_4c8c_ae27_356f296903bf.slice/crio-d52ba93a2e395a5fe3e842c514b38b9be3486416e78d8b85381f805174f39d21 WatchSource:0}: Error finding container d52ba93a2e395a5fe3e842c514b38b9be3486416e78d8b85381f805174f39d21: Status 404 returned error can't find the container with id d52ba93a2e395a5fe3e842c514b38b9be3486416e78d8b85381f805174f39d21 Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.954170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b45644857-ghjwx" event={"ID":"92ff9b70-4f7e-43b8-b270-3470a18fcbda","Type":"ContainerStarted","Data":"bea8a00a01eb30a986c3d9e163f1f6f16bd92caadc91d9f743ebfba2b622cd5d"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.956520 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" event={"ID":"24bbb94b-821e-4c8c-ae27-356f296903bf","Type":"ContainerStarted","Data":"d52ba93a2e395a5fe3e842c514b38b9be3486416e78d8b85381f805174f39d21"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.960649 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9e9fc515-5e15-41fc-8e76-b9f3af099a0f","Type":"ContainerStarted","Data":"089198a21158f6f6b56f6630ad31cc5105f4b0001f6d2937e2ed72533d2aaec8"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.967727 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqfhx" event={"ID":"31aad152-dcb7-472f-a0f8-d90ae972442b","Type":"ContainerStarted","Data":"9b5d6cd64e7bf271ae9d437b3dc8665c7fa56f4097bd261a86583c671e88f922"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.979569 4922 generic.go:334] "Generic (PLEG): container finished" podID="080a4dbf-a721-4b07-8c48-1ed03637a871" containerID="62ed2a0efd29b6e0c16d2156abed1ba070dd313cfdfdd01c3fa25fadbcf98e85" exitCode=0 Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.979665 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-bms2q" event={"ID":"080a4dbf-a721-4b07-8c48-1ed03637a871","Type":"ContainerDied","Data":"62ed2a0efd29b6e0c16d2156abed1ba070dd313cfdfdd01c3fa25fadbcf98e85"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.979693 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-bms2q" event={"ID":"080a4dbf-a721-4b07-8c48-1ed03637a871","Type":"ContainerStarted","Data":"953a10f63b24926b9e9c17b705d09c9065c66847ee05a8c88823e22fb95fe0b7"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.983390 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad","Type":"ContainerStarted","Data":"c1680a690b411a1b27ee20c720d7966628a4b267d54560c222bee5a47c521f74"} Feb 18 11:55:23 crc kubenswrapper[4922]: I0218 11:55:23.987160 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3873c9e-308f-46ea-ac8f-4ee78ca92235","Type":"ContainerStarted","Data":"d077394e189534489b8a5cebe609760985017f8cfefceb856dec5e4e90cc20e1"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.011556 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c7e64f4b-f6d6-43b3-aeab-47a0da094a84","Type":"ContainerStarted","Data":"45913156357b5b17fd4ecdb6fa34c39561bf28bcfc8935781691cdc1a3026c9b"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.011639 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c7e64f4b-f6d6-43b3-aeab-47a0da094a84","Type":"ContainerStarted","Data":"5443076c45ea0609742b231b2d38cf6b7ce01d78bf985c32d5e9cd70f2e17de2"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.015968 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-clz29" 
event={"ID":"71318c6d-61ee-4fb4-8682-7cf3fc0ae044","Type":"ContainerStarted","Data":"c85150d5f0f00107589b346911782e9cacbfd0c2f75cf23b1582f2bb391c607c"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.024661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zjb6x" event={"ID":"4bcd3608-244b-44f0-be1f-5d953cd35964","Type":"ContainerStarted","Data":"4f713717bbd69a1844002c6344555c40f26be59a2b8b6c3086945e62b2e3a5ca"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.050340 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mqkkx" event={"ID":"7598fc1c-8735-4c0e-a095-f13117d3037e","Type":"ContainerStarted","Data":"ffcb369954fde5f2470b893096827224a7452c99e4a60328dd0f303414ba87d8"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.050444 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mqkkx" event={"ID":"7598fc1c-8735-4c0e-a095-f13117d3037e","Type":"ContainerStarted","Data":"5cd177217aaeb4ee4e85b6aadff7f1be10663fa2d35eb9127c500266083c678f"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.070741 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rvpx7" event={"ID":"d7852f85-b8c5-458e-901c-3659c5ed2713","Type":"ContainerStarted","Data":"b5010985da36e7523bd0bc3fdfdcc8c443c58a1cc63438b6b3af9b7f64ca52d5"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.078716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-97df67fc7-qxhz9" event={"ID":"20b58cdc-a36d-4a63-b86d-474dac5d4566","Type":"ContainerStarted","Data":"45548fdfaf9ace756d38cd2bba939d9053aaab1d8d9828430bf80dd188bf6b85"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.079279 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mqkkx" podStartSLOduration=5.079256659 podStartE2EDuration="5.079256659s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:55:24.075433292 +0000 UTC m=+1125.803137372" watchObservedRunningTime="2026-02-18 11:55:24.079256659 +0000 UTC m=+1125.806960749" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.081740 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-569dfc4865-ndwdj" event={"ID":"8d1daa08-43b5-47db-bd94-3efb0eb4dce2","Type":"ContainerStarted","Data":"de849e1d2ee9efed619f5a1fae0183071acffda1776ddbf93a3e733f6554e51b"} Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.466574 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-bms2q" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.523215 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-svc\") pod \"080a4dbf-a721-4b07-8c48-1ed03637a871\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.523310 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-swift-storage-0\") pod \"080a4dbf-a721-4b07-8c48-1ed03637a871\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.523374 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-nb\") pod \"080a4dbf-a721-4b07-8c48-1ed03637a871\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.523460 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-sb\") pod \"080a4dbf-a721-4b07-8c48-1ed03637a871\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.523536 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85mgs\" (UniqueName: \"kubernetes.io/projected/080a4dbf-a721-4b07-8c48-1ed03637a871-kube-api-access-85mgs\") pod \"080a4dbf-a721-4b07-8c48-1ed03637a871\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.523588 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-config\") pod \"080a4dbf-a721-4b07-8c48-1ed03637a871\" (UID: \"080a4dbf-a721-4b07-8c48-1ed03637a871\") " Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.550538 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "080a4dbf-a721-4b07-8c48-1ed03637a871" (UID: "080a4dbf-a721-4b07-8c48-1ed03637a871"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.550835 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080a4dbf-a721-4b07-8c48-1ed03637a871-kube-api-access-85mgs" (OuterVolumeSpecName: "kube-api-access-85mgs") pod "080a4dbf-a721-4b07-8c48-1ed03637a871" (UID: "080a4dbf-a721-4b07-8c48-1ed03637a871"). InnerVolumeSpecName "kube-api-access-85mgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.555852 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "080a4dbf-a721-4b07-8c48-1ed03637a871" (UID: "080a4dbf-a721-4b07-8c48-1ed03637a871"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.564336 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-config" (OuterVolumeSpecName: "config") pod "080a4dbf-a721-4b07-8c48-1ed03637a871" (UID: "080a4dbf-a721-4b07-8c48-1ed03637a871"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.588965 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "080a4dbf-a721-4b07-8c48-1ed03637a871" (UID: "080a4dbf-a721-4b07-8c48-1ed03637a871"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.589505 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "080a4dbf-a721-4b07-8c48-1ed03637a871" (UID: "080a4dbf-a721-4b07-8c48-1ed03637a871"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.629852 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mgs\" (UniqueName: \"kubernetes.io/projected/080a4dbf-a721-4b07-8c48-1ed03637a871-kube-api-access-85mgs\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.629890 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.629901 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.629914 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.629925 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.629935 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/080a4dbf-a721-4b07-8c48-1ed03637a871-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:24 crc kubenswrapper[4922]: I0218 11:55:24.990267 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" path="/var/lib/kubelet/pods/df0f7ce3-64d4-45c3-b416-58e49b5b5bac/volumes" Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.108429 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c7e64f4b-f6d6-43b3-aeab-47a0da094a84","Type":"ContainerStarted","Data":"580ec3bf82a2cc3d74061b0ae6a5ff2d1e8371f633928e9b01fc3ab3ece19784"} Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.108663 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api-log" containerID="cri-o://45913156357b5b17fd4ecdb6fa34c39561bf28bcfc8935781691cdc1a3026c9b" gracePeriod=30 Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.109311 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.109396 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" containerID="cri-o://580ec3bf82a2cc3d74061b0ae6a5ff2d1e8371f633928e9b01fc3ab3ece19784" gracePeriod=30 Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.134749 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-bms2q" event={"ID":"080a4dbf-a721-4b07-8c48-1ed03637a871","Type":"ContainerDied","Data":"953a10f63b24926b9e9c17b705d09c9065c66847ee05a8c88823e22fb95fe0b7"} Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.134811 4922 scope.go:117] "RemoveContainer" containerID="62ed2a0efd29b6e0c16d2156abed1ba070dd313cfdfdd01c3fa25fadbcf98e85" Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.134948 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-bms2q" Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.145059 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": EOF" Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.149158 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=6.149135978 podStartE2EDuration="6.149135978s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:55:25.134007356 +0000 UTC m=+1126.861711466" watchObservedRunningTime="2026-02-18 11:55:25.149135978 +0000 UTC m=+1126.876840058" Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.165191 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zjb6x" event={"ID":"4bcd3608-244b-44f0-be1f-5d953cd35964","Type":"ContainerStarted","Data":"b2705f7646a829371102cfea131123135c7490beec56768c960d999d9c5fa2de"} Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.216832 4922 generic.go:334] "Generic (PLEG): container finished" podID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerID="7f1abe52f752c943d157e484e47d8d790a98ef3e4904432fd3c21618fac5c1e6" exitCode=0 Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.217387 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" event={"ID":"24bbb94b-821e-4c8c-ae27-356f296903bf","Type":"ContainerDied","Data":"7f1abe52f752c943d157e484e47d8d790a98ef3e4904432fd3c21618fac5c1e6"} Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.229064 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-bms2q"] Feb 18 11:55:25 crc kubenswrapper[4922]: 
I0218 11:55:25.254082 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-bms2q"] Feb 18 11:55:25 crc kubenswrapper[4922]: I0218 11:55:25.265064 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zjb6x" podStartSLOduration=6.265046129 podStartE2EDuration="6.265046129s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:55:25.211520506 +0000 UTC m=+1126.939224576" watchObservedRunningTime="2026-02-18 11:55:25.265046129 +0000 UTC m=+1126.992750209" Feb 18 11:55:26 crc kubenswrapper[4922]: I0218 11:55:26.232837 4922 generic.go:334] "Generic (PLEG): container finished" podID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerID="45913156357b5b17fd4ecdb6fa34c39561bf28bcfc8935781691cdc1a3026c9b" exitCode=143 Feb 18 11:55:26 crc kubenswrapper[4922]: I0218 11:55:26.232921 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c7e64f4b-f6d6-43b3-aeab-47a0da094a84","Type":"ContainerDied","Data":"45913156357b5b17fd4ecdb6fa34c39561bf28bcfc8935781691cdc1a3026c9b"} Feb 18 11:55:26 crc kubenswrapper[4922]: I0218 11:55:26.984736 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080a4dbf-a721-4b07-8c48-1ed03637a871" path="/var/lib/kubelet/pods/080a4dbf-a721-4b07-8c48-1ed03637a871/volumes" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.008743 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-97df67fc7-qxhz9"] Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.048329 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9d79df67b-mg9kq"] Feb 18 11:55:28 crc kubenswrapper[4922]: E0218 11:55:28.048745 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080a4dbf-a721-4b07-8c48-1ed03637a871" containerName="init" 
Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.048761 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="080a4dbf-a721-4b07-8c48-1ed03637a871" containerName="init" Feb 18 11:55:28 crc kubenswrapper[4922]: E0218 11:55:28.048779 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="dnsmasq-dns" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.048787 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="dnsmasq-dns" Feb 18 11:55:28 crc kubenswrapper[4922]: E0218 11:55:28.048809 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="init" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.048824 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="init" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.048986 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0f7ce3-64d4-45c3-b416-58e49b5b5bac" containerName="dnsmasq-dns" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.049012 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="080a4dbf-a721-4b07-8c48-1ed03637a871" containerName="init" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.049938 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.054881 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.080002 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9d79df67b-mg9kq"] Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.126954 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b45644857-ghjwx"] Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.183177 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bbf5454f6-d5958"] Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.187196 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.196649 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbf5454f6-d5958"] Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.214863 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-tls-certs\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.215017 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-scripts\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.215051 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-config-data\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.215096 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-combined-ca-bundle\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.215129 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-logs\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.215452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-secret-key\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.215501 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvkcr\" (UniqueName: \"kubernetes.io/projected/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-kube-api-access-kvkcr\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.318311 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8759d-86ff-415d-936a-064ef742f0d9-config-data\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.318394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvkcr\" (UniqueName: \"kubernetes.io/projected/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-kube-api-access-kvkcr\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.318573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bc8759d-86ff-415d-936a-064ef742f0d9-logs\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.318676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-tls-certs\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320217 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-scripts\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320267 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-config-data\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320303 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-horizon-secret-key\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320324 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-horizon-tls-certs\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320379 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-combined-ca-bundle\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320416 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-combined-ca-bundle\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.320646 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-logs\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.321161 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-logs\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.321469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bc8759d-86ff-415d-936a-064ef742f0d9-scripts\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.321525 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-secret-key\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.321554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6qh\" (UniqueName: \"kubernetes.io/projected/3bc8759d-86ff-415d-936a-064ef742f0d9-kube-api-access-9r6qh\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.321657 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-config-data\") pod 
\"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.321702 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-scripts\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.327937 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-secret-key\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.328038 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-tls-certs\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.328555 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-combined-ca-bundle\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.343608 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvkcr\" (UniqueName: \"kubernetes.io/projected/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-kube-api-access-kvkcr\") pod \"horizon-9d79df67b-mg9kq\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") " pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 
11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.387810 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424097 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-horizon-secret-key\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424155 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-horizon-tls-certs\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424208 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-combined-ca-bundle\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424307 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bc8759d-86ff-415d-936a-064ef742f0d9-scripts\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424347 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6qh\" (UniqueName: \"kubernetes.io/projected/3bc8759d-86ff-415d-936a-064ef742f0d9-kube-api-access-9r6qh\") pod \"horizon-7bbf5454f6-d5958\" 
(UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424407 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8759d-86ff-415d-936a-064ef742f0d9-config-data\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424437 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bc8759d-86ff-415d-936a-064ef742f0d9-logs\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.424892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bc8759d-86ff-415d-936a-064ef742f0d9-logs\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.425457 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bc8759d-86ff-415d-936a-064ef742f0d9-scripts\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.433289 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8759d-86ff-415d-936a-064ef742f0d9-config-data\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.436336 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-horizon-tls-certs\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.437182 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-horizon-secret-key\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.437238 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc8759d-86ff-415d-936a-064ef742f0d9-combined-ca-bundle\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.446223 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6qh\" (UniqueName: \"kubernetes.io/projected/3bc8759d-86ff-415d-936a-064ef742f0d9-kube-api-access-9r6qh\") pod \"horizon-7bbf5454f6-d5958\" (UID: \"3bc8759d-86ff-415d-936a-064ef742f0d9\") " pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:28 crc kubenswrapper[4922]: I0218 11:55:28.522895 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:55:30 crc kubenswrapper[4922]: I0218 11:55:30.004772 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 18 11:55:31 crc kubenswrapper[4922]: I0218 11:55:31.542959 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": read tcp 10.217.0.2:55390->10.217.0.149:9322: read: connection reset by peer" Feb 18 11:55:32 crc kubenswrapper[4922]: I0218 11:55:32.303941 4922 generic.go:334] "Generic (PLEG): container finished" podID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerID="580ec3bf82a2cc3d74061b0ae6a5ff2d1e8371f633928e9b01fc3ab3ece19784" exitCode=0 Feb 18 11:55:32 crc kubenswrapper[4922]: I0218 11:55:32.304023 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c7e64f4b-f6d6-43b3-aeab-47a0da094a84","Type":"ContainerDied","Data":"580ec3bf82a2cc3d74061b0ae6a5ff2d1e8371f633928e9b01fc3ab3ece19784"} Feb 18 11:55:34 crc kubenswrapper[4922]: I0218 11:55:34.322332 4922 generic.go:334] "Generic (PLEG): container finished" podID="7598fc1c-8735-4c0e-a095-f13117d3037e" containerID="ffcb369954fde5f2470b893096827224a7452c99e4a60328dd0f303414ba87d8" exitCode=0 Feb 18 11:55:34 crc kubenswrapper[4922]: I0218 11:55:34.322412 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mqkkx" event={"ID":"7598fc1c-8735-4c0e-a095-f13117d3037e","Type":"ContainerDied","Data":"ffcb369954fde5f2470b893096827224a7452c99e4a60328dd0f303414ba87d8"} Feb 18 11:55:35 crc kubenswrapper[4922]: I0218 11:55:35.006183 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get 
\"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Feb 18 11:55:40 crc kubenswrapper[4922]: I0218 11:55:40.005539 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Feb 18 11:55:41 crc kubenswrapper[4922]: E0218 11:55:41.387542 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 11:55:41 crc kubenswrapper[4922]: E0218 11:55:41.388733 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd8h4h576hd5h66ch9bh58h8fh5ffh64fh545h5b6h5dbh58bh64h65bh67fh648h645h579hcdh68dh5cbh558h74h7fh5f5h5b8h5bfh555h5d5h689q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p24jm,ReadOnly:true,MountPath:/var/run/secrets/k
ubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-97df67fc7-qxhz9_openstack(20b58cdc-a36d-4a63-b86d-474dac5d4566): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:55:41 crc kubenswrapper[4922]: E0218 11:55:41.391319 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-97df67fc7-qxhz9" podUID="20b58cdc-a36d-4a63-b86d-474dac5d4566" Feb 18 11:55:45 crc kubenswrapper[4922]: I0218 11:55:45.005975 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Feb 18 11:55:50 crc kubenswrapper[4922]: I0218 11:55:50.005715 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" 
containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Feb 18 11:56:00 crc kubenswrapper[4922]: I0218 11:56:00.005883 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.007575 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 11:56:05.545355 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 11:56:05.545889 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-725mj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mqfhx_openstack(31aad152-dcb7-472f-a0f8-d90ae972442b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 11:56:05.547039 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mqfhx" 
podUID="31aad152-dcb7-472f-a0f8-d90ae972442b" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.600760 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.608815 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mqkkx" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.609252 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mqkkx" event={"ID":"7598fc1c-8735-4c0e-a095-f13117d3037e","Type":"ContainerDied","Data":"5cd177217aaeb4ee4e85b6aadff7f1be10663fa2d35eb9127c500266083c678f"} Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.609288 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd177217aaeb4ee4e85b6aadff7f1be10663fa2d35eb9127c500266083c678f" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.611415 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-97df67fc7-qxhz9" event={"ID":"20b58cdc-a36d-4a63-b86d-474dac5d4566","Type":"ContainerDied","Data":"45548fdfaf9ace756d38cd2bba939d9053aaab1d8d9828430bf80dd188bf6b85"} Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.611488 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-97df67fc7-qxhz9" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 11:56:05.612387 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mqfhx" podUID="31aad152-dcb7-472f-a0f8-d90ae972442b" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 11:56:05.732918 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 11:56:05.733094 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n657h5b4h564hcfh5d8hch8bh699h86h5bch678h5b8hd7h675h84h658h76hd9h56dh675h58h5dfh68dh5c6hdhb5h5f7h54ch67bh654h57ch67dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gmpjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5b45644857-ghjwx_openstack(92ff9b70-4f7e-43b8-b270-3470a18fcbda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:56:05 crc kubenswrapper[4922]: E0218 
11:56:05.736108 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5b45644857-ghjwx" podUID="92ff9b70-4f7e-43b8-b270-3470a18fcbda" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800007 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-scripts\") pod \"7598fc1c-8735-4c0e-a095-f13117d3037e\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800080 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-config-data\") pod \"20b58cdc-a36d-4a63-b86d-474dac5d4566\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800166 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-credential-keys\") pod \"7598fc1c-8735-4c0e-a095-f13117d3037e\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800208 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-scripts\") pod \"20b58cdc-a36d-4a63-b86d-474dac5d4566\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-config-data\") pod \"7598fc1c-8735-4c0e-a095-f13117d3037e\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800274 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-combined-ca-bundle\") pod \"7598fc1c-8735-4c0e-a095-f13117d3037e\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800325 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p24jm\" (UniqueName: \"kubernetes.io/projected/20b58cdc-a36d-4a63-b86d-474dac5d4566-kube-api-access-p24jm\") pod \"20b58cdc-a36d-4a63-b86d-474dac5d4566\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800417 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20b58cdc-a36d-4a63-b86d-474dac5d4566-horizon-secret-key\") pod \"20b58cdc-a36d-4a63-b86d-474dac5d4566\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800447 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj8x7\" (UniqueName: \"kubernetes.io/projected/7598fc1c-8735-4c0e-a095-f13117d3037e-kube-api-access-kj8x7\") pod \"7598fc1c-8735-4c0e-a095-f13117d3037e\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800488 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b58cdc-a36d-4a63-b86d-474dac5d4566-logs\") pod \"20b58cdc-a36d-4a63-b86d-474dac5d4566\" (UID: \"20b58cdc-a36d-4a63-b86d-474dac5d4566\") " Feb 
18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.800562 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-fernet-keys\") pod \"7598fc1c-8735-4c0e-a095-f13117d3037e\" (UID: \"7598fc1c-8735-4c0e-a095-f13117d3037e\") " Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.802822 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-config-data" (OuterVolumeSpecName: "config-data") pod "20b58cdc-a36d-4a63-b86d-474dac5d4566" (UID: "20b58cdc-a36d-4a63-b86d-474dac5d4566"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.803278 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20b58cdc-a36d-4a63-b86d-474dac5d4566-logs" (OuterVolumeSpecName: "logs") pod "20b58cdc-a36d-4a63-b86d-474dac5d4566" (UID: "20b58cdc-a36d-4a63-b86d-474dac5d4566"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.812860 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-scripts" (OuterVolumeSpecName: "scripts") pod "20b58cdc-a36d-4a63-b86d-474dac5d4566" (UID: "20b58cdc-a36d-4a63-b86d-474dac5d4566"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.824871 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-scripts" (OuterVolumeSpecName: "scripts") pod "7598fc1c-8735-4c0e-a095-f13117d3037e" (UID: "7598fc1c-8735-4c0e-a095-f13117d3037e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.827135 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7598fc1c-8735-4c0e-a095-f13117d3037e" (UID: "7598fc1c-8735-4c0e-a095-f13117d3037e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.827316 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b58cdc-a36d-4a63-b86d-474dac5d4566-kube-api-access-p24jm" (OuterVolumeSpecName: "kube-api-access-p24jm") pod "20b58cdc-a36d-4a63-b86d-474dac5d4566" (UID: "20b58cdc-a36d-4a63-b86d-474dac5d4566"). InnerVolumeSpecName "kube-api-access-p24jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.829352 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7598fc1c-8735-4c0e-a095-f13117d3037e-kube-api-access-kj8x7" (OuterVolumeSpecName: "kube-api-access-kj8x7") pod "7598fc1c-8735-4c0e-a095-f13117d3037e" (UID: "7598fc1c-8735-4c0e-a095-f13117d3037e"). InnerVolumeSpecName "kube-api-access-kj8x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.829724 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b58cdc-a36d-4a63-b86d-474dac5d4566-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "20b58cdc-a36d-4a63-b86d-474dac5d4566" (UID: "20b58cdc-a36d-4a63-b86d-474dac5d4566"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.830121 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7598fc1c-8735-4c0e-a095-f13117d3037e" (UID: "7598fc1c-8735-4c0e-a095-f13117d3037e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.855756 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-config-data" (OuterVolumeSpecName: "config-data") pod "7598fc1c-8735-4c0e-a095-f13117d3037e" (UID: "7598fc1c-8735-4c0e-a095-f13117d3037e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.856627 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7598fc1c-8735-4c0e-a095-f13117d3037e" (UID: "7598fc1c-8735-4c0e-a095-f13117d3037e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.902509 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903039 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903060 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903070 4922 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903081 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20b58cdc-a36d-4a63-b86d-474dac5d4566-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903089 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903098 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7598fc1c-8735-4c0e-a095-f13117d3037e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903107 4922 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-p24jm\" (UniqueName: \"kubernetes.io/projected/20b58cdc-a36d-4a63-b86d-474dac5d4566-kube-api-access-p24jm\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903116 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20b58cdc-a36d-4a63-b86d-474dac5d4566-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903124 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj8x7\" (UniqueName: \"kubernetes.io/projected/7598fc1c-8735-4c0e-a095-f13117d3037e-kube-api-access-kj8x7\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.903132 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b58cdc-a36d-4a63-b86d-474dac5d4566-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:05 crc kubenswrapper[4922]: I0218 11:56:05.995295 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-97df67fc7-qxhz9"] Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.005907 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-97df67fc7-qxhz9"] Feb 18 11:56:06 crc kubenswrapper[4922]: E0218 11:56:06.045155 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 18 11:56:06 crc kubenswrapper[4922]: E0218 11:56:06.045379 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64bh648hb7h56chd8h679hcfh54fh658h688h76hbch545h584h5ffh5d5h59fh648h6h54bh5cfhbbh568hdch55ch645h5dbh76h5b7h598h78h5f4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lbxnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-569dfc4865-ndwdj_openstack(8d1daa08-43b5-47db-bd94-3efb0eb4dce2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:56:06 crc kubenswrapper[4922]: E0218 
11:56:06.048027 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-569dfc4865-ndwdj" podUID="8d1daa08-43b5-47db-bd94-3efb0eb4dce2" Feb 18 11:56:06 crc kubenswrapper[4922]: E0218 11:56:06.590754 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 18 11:56:06 crc kubenswrapper[4922]: E0218 11:56:06.590919 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n556hfbh5c8h658h586hdfh5c5h6dh64dh9chfchd6h8ch548h579h8chb5h95h74h64ch65bh5c8hch65fh66ch644h589h5b8h567h579h5c7h7q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmwrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a3873c9e-308f-46ea-ac8f-4ee78ca92235): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.619032 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mqkkx" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.792647 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mqkkx"] Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.801507 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mqkkx"] Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.902997 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8pvj2"] Feb 18 11:56:06 crc kubenswrapper[4922]: E0218 11:56:06.903497 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7598fc1c-8735-4c0e-a095-f13117d3037e" containerName="keystone-bootstrap" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.903513 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7598fc1c-8735-4c0e-a095-f13117d3037e" containerName="keystone-bootstrap" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.903702 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7598fc1c-8735-4c0e-a095-f13117d3037e" containerName="keystone-bootstrap" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.904287 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.908972 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.909044 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8rn4" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.909121 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.909621 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.909865 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.927585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8pvj2"] Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.933485 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-combined-ca-bundle\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.933593 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcv5t\" (UniqueName: \"kubernetes.io/projected/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-kube-api-access-bcv5t\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.933626 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-config-data\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.933674 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-fernet-keys\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.933731 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-credential-keys\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.933808 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-scripts\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.988652 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b58cdc-a36d-4a63-b86d-474dac5d4566" path="/var/lib/kubelet/pods/20b58cdc-a36d-4a63-b86d-474dac5d4566/volumes" Feb 18 11:56:06 crc kubenswrapper[4922]: I0218 11:56:06.989033 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7598fc1c-8735-4c0e-a095-f13117d3037e" path="/var/lib/kubelet/pods/7598fc1c-8735-4c0e-a095-f13117d3037e/volumes" Feb 18 11:56:07 crc 
kubenswrapper[4922]: I0218 11:56:07.038150 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-fernet-keys\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.038299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-credential-keys\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.038343 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-scripts\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.038457 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-combined-ca-bundle\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.038698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcv5t\" (UniqueName: \"kubernetes.io/projected/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-kube-api-access-bcv5t\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.038737 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-config-data\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.047477 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-credential-keys\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.047782 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-config-data\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.048807 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-fernet-keys\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.051142 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-combined-ca-bundle\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.052843 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-scripts\") pod 
\"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.066688 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcv5t\" (UniqueName: \"kubernetes.io/projected/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-kube-api-access-bcv5t\") pod \"keystone-bootstrap-8pvj2\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") " pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:07 crc kubenswrapper[4922]: I0218 11:56:07.236955 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8pvj2" Feb 18 11:56:08 crc kubenswrapper[4922]: E0218 11:56:08.352905 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 18 11:56:08 crc kubenswrapper[4922]: E0218 11:56:08.353666 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h8nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rvpx7_openstack(d7852f85-b8c5-458e-901c-3659c5ed2713): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:56:08 crc kubenswrapper[4922]: E0218 11:56:08.354842 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rvpx7" podUID="d7852f85-b8c5-458e-901c-3659c5ed2713" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.390189 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.453340 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.471914 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-custom-prometheus-ca\") pod \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.471993 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-logs\") pod \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472025 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6bkw\" (UniqueName: \"kubernetes.io/projected/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-kube-api-access-k6bkw\") pod \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472043 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-config-data\") pod \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472060 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-horizon-secret-key\") pod \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472165 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-combined-ca-bundle\") pod \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472233 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-scripts\") pod \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472261 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-logs\") pod \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\" (UID: \"c7e64f4b-f6d6-43b3-aeab-47a0da094a84\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472287 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-config-data\") pod \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.472378 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbxnl\" (UniqueName: \"kubernetes.io/projected/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-kube-api-access-lbxnl\") pod \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\" (UID: \"8d1daa08-43b5-47db-bd94-3efb0eb4dce2\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.478107 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-kube-api-access-lbxnl" (OuterVolumeSpecName: "kube-api-access-lbxnl") pod "8d1daa08-43b5-47db-bd94-3efb0eb4dce2" (UID: "8d1daa08-43b5-47db-bd94-3efb0eb4dce2"). 
InnerVolumeSpecName "kube-api-access-lbxnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.478675 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-logs" (OuterVolumeSpecName: "logs") pod "8d1daa08-43b5-47db-bd94-3efb0eb4dce2" (UID: "8d1daa08-43b5-47db-bd94-3efb0eb4dce2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.478758 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-logs" (OuterVolumeSpecName: "logs") pod "c7e64f4b-f6d6-43b3-aeab-47a0da094a84" (UID: "c7e64f4b-f6d6-43b3-aeab-47a0da094a84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.479230 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-scripts" (OuterVolumeSpecName: "scripts") pod "8d1daa08-43b5-47db-bd94-3efb0eb4dce2" (UID: "8d1daa08-43b5-47db-bd94-3efb0eb4dce2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.480149 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-config-data" (OuterVolumeSpecName: "config-data") pod "8d1daa08-43b5-47db-bd94-3efb0eb4dce2" (UID: "8d1daa08-43b5-47db-bd94-3efb0eb4dce2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.481475 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8d1daa08-43b5-47db-bd94-3efb0eb4dce2" (UID: "8d1daa08-43b5-47db-bd94-3efb0eb4dce2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.481969 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-kube-api-access-k6bkw" (OuterVolumeSpecName: "kube-api-access-k6bkw") pod "c7e64f4b-f6d6-43b3-aeab-47a0da094a84" (UID: "c7e64f4b-f6d6-43b3-aeab-47a0da094a84"). InnerVolumeSpecName "kube-api-access-k6bkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.486039 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.487858 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8pvj2"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.530109 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9d79df67b-mg9kq"] Feb 18 11:56:08 crc kubenswrapper[4922]: W0218 11:56:08.540567 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b4cee1_5234_4b6c_93fa_3cb5687ecba9.slice/crio-daa7d2ea30da11dcfd9f4970f896f52151306629ac261d217fb3c7d7a4b261fd WatchSource:0}: Error finding container daa7d2ea30da11dcfd9f4970f896f52151306629ac261d217fb3c7d7a4b261fd: Status 404 returned error can't find the container with id daa7d2ea30da11dcfd9f4970f896f52151306629ac261d217fb3c7d7a4b261fd Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.569244 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbf5454f6-d5958"] Feb 18 11:56:08 crc kubenswrapper[4922]: W0218 11:56:08.570314 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc8759d_86ff_415d_936a_064ef742f0d9.slice/crio-bcfc19d1c521778a93513ec653e31694999fc6019dbd35eb5f3614b0b07075a0 WatchSource:0}: Error finding container bcfc19d1c521778a93513ec653e31694999fc6019dbd35eb5f3614b0b07075a0: Status 404 returned error can't find the container with id bcfc19d1c521778a93513ec653e31694999fc6019dbd35eb5f3614b0b07075a0 Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573204 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-scripts\") pod \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " Feb 18 11:56:08 crc 
kubenswrapper[4922]: I0218 11:56:08.573257 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-config-data\") pod \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573274 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ff9b70-4f7e-43b8-b270-3470a18fcbda-horizon-secret-key\") pod \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573323 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ff9b70-4f7e-43b8-b270-3470a18fcbda-logs\") pod \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573383 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmpjz\" (UniqueName: \"kubernetes.io/projected/92ff9b70-4f7e-43b8-b270-3470a18fcbda-kube-api-access-gmpjz\") pod \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\" (UID: \"92ff9b70-4f7e-43b8-b270-3470a18fcbda\") " Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573706 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573720 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6bkw\" (UniqueName: \"kubernetes.io/projected/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-kube-api-access-k6bkw\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573730 4922 
reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573739 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573746 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573755 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.573762 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbxnl\" (UniqueName: \"kubernetes.io/projected/8d1daa08-43b5-47db-bd94-3efb0eb4dce2-kube-api-access-lbxnl\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.574576 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ff9b70-4f7e-43b8-b270-3470a18fcbda-logs" (OuterVolumeSpecName: "logs") pod "92ff9b70-4f7e-43b8-b270-3470a18fcbda" (UID: "92ff9b70-4f7e-43b8-b270-3470a18fcbda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.574640 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-scripts" (OuterVolumeSpecName: "scripts") pod "92ff9b70-4f7e-43b8-b270-3470a18fcbda" (UID: "92ff9b70-4f7e-43b8-b270-3470a18fcbda"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.575037 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-config-data" (OuterVolumeSpecName: "config-data") pod "92ff9b70-4f7e-43b8-b270-3470a18fcbda" (UID: "92ff9b70-4f7e-43b8-b270-3470a18fcbda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.584205 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ff9b70-4f7e-43b8-b270-3470a18fcbda-kube-api-access-gmpjz" (OuterVolumeSpecName: "kube-api-access-gmpjz") pod "92ff9b70-4f7e-43b8-b270-3470a18fcbda" (UID: "92ff9b70-4f7e-43b8-b270-3470a18fcbda"). InnerVolumeSpecName "kube-api-access-gmpjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.584786 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ff9b70-4f7e-43b8-b270-3470a18fcbda-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "92ff9b70-4f7e-43b8-b270-3470a18fcbda" (UID: "92ff9b70-4f7e-43b8-b270-3470a18fcbda"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.616670 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c7e64f4b-f6d6-43b3-aeab-47a0da094a84" (UID: "c7e64f4b-f6d6-43b3-aeab-47a0da094a84"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.633893 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e64f4b-f6d6-43b3-aeab-47a0da094a84" (UID: "c7e64f4b-f6d6-43b3-aeab-47a0da094a84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.635518 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d79df67b-mg9kq" event={"ID":"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9","Type":"ContainerStarted","Data":"daa7d2ea30da11dcfd9f4970f896f52151306629ac261d217fb3c7d7a4b261fd"} Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.636662 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbf5454f6-d5958" event={"ID":"3bc8759d-86ff-415d-936a-064ef742f0d9","Type":"ContainerStarted","Data":"bcfc19d1c521778a93513ec653e31694999fc6019dbd35eb5f3614b0b07075a0"} Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.637525 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-569dfc4865-ndwdj" event={"ID":"8d1daa08-43b5-47db-bd94-3efb0eb4dce2","Type":"ContainerDied","Data":"de849e1d2ee9efed619f5a1fae0183071acffda1776ddbf93a3e733f6554e51b"} Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.637554 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-569dfc4865-ndwdj" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.638320 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b45644857-ghjwx" event={"ID":"92ff9b70-4f7e-43b8-b270-3470a18fcbda","Type":"ContainerDied","Data":"bea8a00a01eb30a986c3d9e163f1f6f16bd92caadc91d9f743ebfba2b622cd5d"} Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.638424 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b45644857-ghjwx" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.646814 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8pvj2" event={"ID":"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0","Type":"ContainerStarted","Data":"00caa03b7a061a2933729fdd382be3e5af6da2b0ea0a8fe37af7145a15ee06a2"} Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.647126 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-config-data" (OuterVolumeSpecName: "config-data") pod "c7e64f4b-f6d6-43b3-aeab-47a0da094a84" (UID: "c7e64f4b-f6d6-43b3-aeab-47a0da094a84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.650312 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.650645 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c7e64f4b-f6d6-43b3-aeab-47a0da094a84","Type":"ContainerDied","Data":"5443076c45ea0609742b231b2d38cf6b7ce01d78bf985c32d5e9cd70f2e17de2"} Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.650686 4922 scope.go:117] "RemoveContainer" containerID="580ec3bf82a2cc3d74061b0ae6a5ff2d1e8371f633928e9b01fc3ab3ece19784" Feb 18 11:56:08 crc kubenswrapper[4922]: E0218 11:56:08.652642 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-rvpx7" podUID="d7852f85-b8c5-458e-901c-3659c5ed2713" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.675631 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ff9b70-4f7e-43b8-b270-3470a18fcbda-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.675713 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmpjz\" (UniqueName: \"kubernetes.io/projected/92ff9b70-4f7e-43b8-b270-3470a18fcbda-kube-api-access-gmpjz\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.675959 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.676001 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc 
kubenswrapper[4922]: I0218 11:56:08.676017 4922 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.676028 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ff9b70-4f7e-43b8-b270-3470a18fcbda-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.676039 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ff9b70-4f7e-43b8-b270-3470a18fcbda-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.676050 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e64f4b-f6d6-43b3-aeab-47a0da094a84-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.697300 4922 scope.go:117] "RemoveContainer" containerID="45913156357b5b17fd4ecdb6fa34c39561bf28bcfc8935781691cdc1a3026c9b" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.710081 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b45644857-ghjwx"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.721451 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b45644857-ghjwx"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.761028 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-569dfc4865-ndwdj"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.776680 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-569dfc4865-ndwdj"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.782754 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/watcher-api-0"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.792639 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.799718 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:08 crc kubenswrapper[4922]: E0218 11:56:08.800132 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.800150 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" Feb 18 11:56:08 crc kubenswrapper[4922]: E0218 11:56:08.800178 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api-log" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.800185 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api-log" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.800353 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api-log" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.800434 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.801401 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.804165 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.810164 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.879105 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-logs\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.879178 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.879246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.879318 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-config-data\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.879476 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfnt\" (UniqueName: \"kubernetes.io/projected/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-kube-api-access-vhfnt\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.997277 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfnt\" (UniqueName: \"kubernetes.io/projected/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-kube-api-access-vhfnt\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.997390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-logs\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.997551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.997613 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0" Feb 18 11:56:08 crc kubenswrapper[4922]: I0218 11:56:08.997693 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-config-data\") pod 
\"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.000291 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-logs\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.003304 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-config-data\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.003624 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.004815 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.015937 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1daa08-43b5-47db-bd94-3efb0eb4dce2" path="/var/lib/kubelet/pods/8d1daa08-43b5-47db-bd94-3efb0eb4dce2/volumes"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.016681 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ff9b70-4f7e-43b8-b270-3470a18fcbda" path="/var/lib/kubelet/pods/92ff9b70-4f7e-43b8-b270-3470a18fcbda/volumes"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.017866 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfnt\" (UniqueName: \"kubernetes.io/projected/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-kube-api-access-vhfnt\") pod \"watcher-api-0\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.026763 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" path="/var/lib/kubelet/pods/c7e64f4b-f6d6-43b3-aeab-47a0da094a84/volumes"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.126544 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.574602 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.662963 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" event={"ID":"24bbb94b-821e-4c8c-ae27-356f296903bf","Type":"ContainerStarted","Data":"4fbf873b475eb4db471744da3b986195e954cf285ed89e854f16fe22b1d17ca4"}
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.663303 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.667078 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9e9fc515-5e15-41fc-8e76-b9f3af099a0f","Type":"ContainerStarted","Data":"538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d"}
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.669706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8pvj2" event={"ID":"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0","Type":"ContainerStarted","Data":"a94ac8d35c2954541cdb5fa078a3d4a480cc9002ceb6fe5897faee87fe9e9f1f"}
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.673578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad","Type":"ContainerStarted","Data":"17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003"}
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.677723 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-clz29" event={"ID":"71318c6d-61ee-4fb4-8682-7cf3fc0ae044","Type":"ContainerStarted","Data":"5597fcea1ce088dbc75d5c2c33874c67f09dcb474426ad6372bb91fff5a29ef3"}
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.700204 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" podStartSLOduration=50.70018397 podStartE2EDuration="50.70018397s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:09.693621644 +0000 UTC m=+1171.421325724" watchObservedRunningTime="2026-02-18 11:56:09.70018397 +0000 UTC m=+1171.427888050"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.733826 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=9.042885889 podStartE2EDuration="50.73380616s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="2026-02-18 11:55:23.767857836 +0000 UTC m=+1125.495561906" lastFinishedPulling="2026-02-18 11:56:05.458778097 +0000 UTC m=+1167.186482177" observedRunningTime="2026-02-18 11:56:09.711511267 +0000 UTC m=+1171.439215347" watchObservedRunningTime="2026-02-18 11:56:09.73380616 +0000 UTC m=+1171.461510240"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.736754 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-clz29" podStartSLOduration=6.241876883 podStartE2EDuration="50.736743415s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="2026-02-18 11:55:23.562051022 +0000 UTC m=+1125.289755102" lastFinishedPulling="2026-02-18 11:56:08.056917554 +0000 UTC m=+1169.784621634" observedRunningTime="2026-02-18 11:56:09.730054295 +0000 UTC m=+1171.457758365" watchObservedRunningTime="2026-02-18 11:56:09.736743415 +0000 UTC m=+1171.464447505"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.753558 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=13.550213046 podStartE2EDuration="50.753540099s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="2026-02-18 11:55:23.219562284 +0000 UTC m=+1124.947266364" lastFinishedPulling="2026-02-18 11:56:00.422889337 +0000 UTC m=+1162.150593417" observedRunningTime="2026-02-18 11:56:09.749556749 +0000 UTC m=+1171.477260839" watchObservedRunningTime="2026-02-18 11:56:09.753540099 +0000 UTC m=+1171.481244169"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.770418 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8pvj2" podStartSLOduration=3.770386755 podStartE2EDuration="3.770386755s" podCreationTimestamp="2026-02-18 11:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:09.766045195 +0000 UTC m=+1171.493749275" watchObservedRunningTime="2026-02-18 11:56:09.770386755 +0000 UTC m=+1171.498090835"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.891966 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:09 crc kubenswrapper[4922]: I0218 11:56:09.921137 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.009196 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c7e64f4b-f6d6-43b3-aeab-47a0da094a84" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.131575 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.131622 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.158998 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.687830 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.719568 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.725192 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.770132 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 18 11:56:10 crc kubenswrapper[4922]: I0218 11:56:10.790276 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Feb 18 11:56:11 crc kubenswrapper[4922]: W0218 11:56:11.586629 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podade99914_d8c2_4a1a_9492_d3bb2a83a64d.slice/crio-fd553f3191fff116bcb61d1761b6014e37c31888c5dc3e6ec76e921294d669b1 WatchSource:0}: Error finding container fd553f3191fff116bcb61d1761b6014e37c31888c5dc3e6ec76e921294d669b1: Status 404 returned error can't find the container with id fd553f3191fff116bcb61d1761b6014e37c31888c5dc3e6ec76e921294d669b1
Feb 18 11:56:11 crc kubenswrapper[4922]: I0218 11:56:11.698578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ade99914-d8c2-4a1a-9492-d3bb2a83a64d","Type":"ContainerStarted","Data":"fd553f3191fff116bcb61d1761b6014e37c31888c5dc3e6ec76e921294d669b1"}
Feb 18 11:56:12 crc kubenswrapper[4922]: I0218 11:56:12.707870 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ade99914-d8c2-4a1a-9492-d3bb2a83a64d","Type":"ContainerStarted","Data":"f656a5eab3bc7d824593462123204765b1a3f4dc494bd8b0ed339492dd2506ae"}
Feb 18 11:56:12 crc kubenswrapper[4922]: I0218 11:56:12.712040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d79df67b-mg9kq" event={"ID":"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9","Type":"ContainerStarted","Data":"666106d3ce79297e2d8ed502af3f7dcfa2bfde51946ace3e35cdb24c5fc323a9"}
Feb 18 11:56:12 crc kubenswrapper[4922]: I0218 11:56:12.714019 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" containerID="cri-o://17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" gracePeriod=30
Feb 18 11:56:12 crc kubenswrapper[4922]: I0218 11:56:12.714345 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbf5454f6-d5958" event={"ID":"3bc8759d-86ff-415d-936a-064ef742f0d9","Type":"ContainerStarted","Data":"51e2a8b5f6d89d25573eab92eb04e45aafe7a4105df852b660f1bd03727a3929"}
Feb 18 11:56:12 crc kubenswrapper[4922]: I0218 11:56:12.714479 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" containerName="watcher-decision-engine" containerID="cri-o://538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d" gracePeriod=30
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.726293 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3873c9e-308f-46ea-ac8f-4ee78ca92235","Type":"ContainerStarted","Data":"a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a"}
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.728851 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ade99914-d8c2-4a1a-9492-d3bb2a83a64d","Type":"ContainerStarted","Data":"0b0005d984f6a47d8e4c600b875d8f557a7684614b4f1a8f4fe087932cc0bade"}
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.729214 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.730904 4922 generic.go:334] "Generic (PLEG): container finished" podID="53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" containerID="a94ac8d35c2954541cdb5fa078a3d4a480cc9002ceb6fe5897faee87fe9e9f1f" exitCode=0
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.730997 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8pvj2" event={"ID":"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0","Type":"ContainerDied","Data":"a94ac8d35c2954541cdb5fa078a3d4a480cc9002ceb6fe5897faee87fe9e9f1f"}
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.740429 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d79df67b-mg9kq" event={"ID":"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9","Type":"ContainerStarted","Data":"5d696a6b456c46bcc03a5a4e42955c244ebb50a3f0ae3ca3ca3bec18dc6258ea"}
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.744300 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbf5454f6-d5958" event={"ID":"3bc8759d-86ff-415d-936a-064ef742f0d9","Type":"ContainerStarted","Data":"16cae7749b19b6f7611278b6d47dec0ae3d07e81721fc5c98ebe25f67bfc34ab"}
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.754619 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=5.7545976450000005 podStartE2EDuration="5.754597645s" podCreationTimestamp="2026-02-18 11:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:13.752464581 +0000 UTC m=+1175.480168681" watchObservedRunningTime="2026-02-18 11:56:13.754597645 +0000 UTC m=+1175.482301725"
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.780196 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bbf5454f6-d5958" podStartSLOduration=42.155892871 podStartE2EDuration="45.780174061s" podCreationTimestamp="2026-02-18 11:55:28 +0000 UTC" firstStartedPulling="2026-02-18 11:56:08.572237353 +0000 UTC m=+1170.299941433" lastFinishedPulling="2026-02-18 11:56:12.196518533 +0000 UTC m=+1173.924222623" observedRunningTime="2026-02-18 11:56:13.77061986 +0000 UTC m=+1175.498323940" watchObservedRunningTime="2026-02-18 11:56:13.780174061 +0000 UTC m=+1175.507878141"
Feb 18 11:56:13 crc kubenswrapper[4922]: I0218 11:56:13.827991 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9d79df67b-mg9kq" podStartSLOduration=42.19425674 podStartE2EDuration="45.82796957s" podCreationTimestamp="2026-02-18 11:55:28 +0000 UTC" firstStartedPulling="2026-02-18 11:56:08.552494523 +0000 UTC m=+1170.280198603" lastFinishedPulling="2026-02-18 11:56:12.186207343 +0000 UTC m=+1173.913911433" observedRunningTime="2026-02-18 11:56:13.821170568 +0000 UTC m=+1175.548874648" watchObservedRunningTime="2026-02-18 11:56:13.82796957 +0000 UTC m=+1175.555673650"
Feb 18 11:56:14 crc kubenswrapper[4922]: I0218 11:56:14.127819 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.125785 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8pvj2"
Feb 18 11:56:15 crc kubenswrapper[4922]: E0218 11:56:15.137654 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 18 11:56:15 crc kubenswrapper[4922]: E0218 11:56:15.139459 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 18 11:56:15 crc kubenswrapper[4922]: E0218 11:56:15.145826 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 18 11:56:15 crc kubenswrapper[4922]: E0218 11:56:15.145905 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.226794 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-scripts\") pod \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") "
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.226890 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-credential-keys\") pod \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") "
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.227040 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-config-data\") pod \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") "
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.227109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcv5t\" (UniqueName: \"kubernetes.io/projected/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-kube-api-access-bcv5t\") pod \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") "
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.227154 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-fernet-keys\") pod \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") "
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.227197 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-combined-ca-bundle\") pod \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\" (UID: \"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0\") "
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.236613 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" (UID: "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.236711 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-kube-api-access-bcv5t" (OuterVolumeSpecName: "kube-api-access-bcv5t") pod "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" (UID: "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0"). InnerVolumeSpecName "kube-api-access-bcv5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.250611 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" (UID: "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.251347 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-scripts" (OuterVolumeSpecName: "scripts") pod "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" (UID: "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.255078 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" (UID: "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.259190 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-config-data" (OuterVolumeSpecName: "config-data") pod "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" (UID: "53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.330158 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.330217 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.330231 4922 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.330244 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.330258 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcv5t\" (UniqueName: \"kubernetes.io/projected/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-kube-api-access-bcv5t\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.330272 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.690555 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.787878 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.788437 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8pvj2"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.788498 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8pvj2" event={"ID":"53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0","Type":"ContainerDied","Data":"00caa03b7a061a2933729fdd382be3e5af6da2b0ea0a8fe37af7145a15ee06a2"}
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.788547 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00caa03b7a061a2933729fdd382be3e5af6da2b0ea0a8fe37af7145a15ee06a2"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.789246 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zmcwq"]
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.789482 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerName="dnsmasq-dns" containerID="cri-o://9c3a3f840bf392e642ec5f6d88074e14f3634fce3f5fe950f1ca6ef37f5d1bd5" gracePeriod=10
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.919138 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b854f8786-pls2t"]
Feb 18 11:56:15 crc kubenswrapper[4922]: E0218 11:56:15.919838 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" containerName="keystone-bootstrap"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.919853 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" containerName="keystone-bootstrap"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.920101 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" containerName="keystone-bootstrap"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.920847 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.926902 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.927350 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.927607 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.927732 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.927774 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8rn4"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.939612 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2jn2\" (UniqueName: \"kubernetes.io/projected/2efd0609-4858-47ce-8213-6a74510e8acf-kube-api-access-z2jn2\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944755 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-fernet-keys\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944790 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-combined-ca-bundle\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944885 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-internal-tls-certs\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944929 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-config-data\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944961 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-scripts\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.944985 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-public-tls-certs\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.945012 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-credential-keys\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:15 crc kubenswrapper[4922]: I0218 11:56:15.966826 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b854f8786-pls2t"]
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.047578 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-internal-tls-certs\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.047727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-config-data\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.047758 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-scripts\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.047775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-public-tls-certs\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.047802 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-credential-keys\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.049521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2jn2\" (UniqueName: \"kubernetes.io/projected/2efd0609-4858-47ce-8213-6a74510e8acf-kube-api-access-z2jn2\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.049566 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-fernet-keys\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.049614 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-combined-ca-bundle\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.053576 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-config-data\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.054978 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-credential-keys\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.055712 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-public-tls-certs\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.058943 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-internal-tls-certs\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.060064 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-scripts\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.060515 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-combined-ca-bundle\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.071223 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2efd0609-4858-47ce-8213-6a74510e8acf-fernet-keys\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.076715 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2jn2\" (UniqueName: \"kubernetes.io/projected/2efd0609-4858-47ce-8213-6a74510e8acf-kube-api-access-z2jn2\") pod \"keystone-b854f8786-pls2t\" (UID: \"2efd0609-4858-47ce-8213-6a74510e8acf\") " pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.255467 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.299581 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b854f8786-pls2t"
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.809637 4922 generic.go:334] "Generic (PLEG): container finished" podID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerID="9c3a3f840bf392e642ec5f6d88074e14f3634fce3f5fe950f1ca6ef37f5d1bd5" exitCode=0
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.811215 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" event={"ID":"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0","Type":"ContainerDied","Data":"9c3a3f840bf392e642ec5f6d88074e14f3634fce3f5fe950f1ca6ef37f5d1bd5"}
Feb 18 11:56:16 crc kubenswrapper[4922]: I0218 11:56:16.814021 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b854f8786-pls2t"]
Feb 18 11:56:18 crc kubenswrapper[4922]: I0218 11:56:18.388921 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9d79df67b-mg9kq"
Feb 18 11:56:18 crc kubenswrapper[4922]: I0218 11:56:18.389880 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9d79df67b-mg9kq"
Feb 18 11:56:18 crc kubenswrapper[4922]: I0218 11:56:18.523290 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bbf5454f6-d5958"
Feb 18 11:56:18 crc kubenswrapper[4922]: I0218 11:56:18.523418 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bbf5454f6-d5958"
Feb 18 11:56:19 crc kubenswrapper[4922]: I0218 11:56:19.126995 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Feb 18 11:56:19 crc kubenswrapper[4922]: I0218 11:56:19.131330 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Feb 18 11:56:19 crc kubenswrapper[4922]: I0218 11:56:19.848470 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 18 11:56:19 crc kubenswrapper[4922]: E0218 11:56:19.895647 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 18 11:56:19 crc kubenswrapper[4922]: E0218 11:56:19.898924 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 18 11:56:19 crc kubenswrapper[4922]: E0218 11:56:19.900599 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 18 11:56:19 crc kubenswrapper[4922]: E0218 11:56:19.900641 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" containerName="watcher-decision-engine" Feb 18 11:56:20 crc kubenswrapper[4922]: E0218 11:56:20.133701 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] 
Feb 18 11:56:20 crc kubenswrapper[4922]: E0218 11:56:20.135382 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:20 crc kubenswrapper[4922]: E0218 11:56:20.139704 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:20 crc kubenswrapper[4922]: E0218 11:56:20.139769 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:22 crc kubenswrapper[4922]: W0218 11:56:22.390467 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2efd0609_4858_47ce_8213_6a74510e8acf.slice/crio-3d68965d6f82575327207faca5029e68947c2c261cc7f048c8228ca314476d56 WatchSource:0}: Error finding container 3d68965d6f82575327207faca5029e68947c2c261cc7f048c8228ca314476d56: Status 404 returned error can't find the container with id 3d68965d6f82575327207faca5029e68947c2c261cc7f048c8228ca314476d56 Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.473221 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.617980 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7r9n\" (UniqueName: \"kubernetes.io/projected/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-kube-api-access-f7r9n\") pod \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.618072 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-config\") pod \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.618240 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-dns-svc\") pod \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.618271 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-nb\") pod \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.618300 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-sb\") pod \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\" (UID: \"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0\") " Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.624394 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-kube-api-access-f7r9n" (OuterVolumeSpecName: "kube-api-access-f7r9n") pod "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" (UID: "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0"). InnerVolumeSpecName "kube-api-access-f7r9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.679662 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" (UID: "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.684764 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" (UID: "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.685989 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-config" (OuterVolumeSpecName: "config") pod "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" (UID: "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.696188 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" (UID: "ec6256f2-24ef-4e83-9a3c-c98ef206f7b0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.720283 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.720314 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.720328 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.720337 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7r9n\" (UniqueName: \"kubernetes.io/projected/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-kube-api-access-f7r9n\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.720346 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.882148 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b854f8786-pls2t" event={"ID":"2efd0609-4858-47ce-8213-6a74510e8acf","Type":"ContainerStarted","Data":"3d68965d6f82575327207faca5029e68947c2c261cc7f048c8228ca314476d56"} Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.885324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" 
event={"ID":"ec6256f2-24ef-4e83-9a3c-c98ef206f7b0","Type":"ContainerDied","Data":"dd9d9badec862e12fd53c23629f63f90af2fbdcdb63f9f7d603ffed75ff6a6ad"} Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.885438 4922 scope.go:117] "RemoveContainer" containerID="9c3a3f840bf392e642ec5f6d88074e14f3634fce3f5fe950f1ca6ef37f5d1bd5" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.910819 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.946027 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zmcwq"] Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.955503 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zmcwq"] Feb 18 11:56:22 crc kubenswrapper[4922]: I0218 11:56:22.989532 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" path="/var/lib/kubelet/pods/ec6256f2-24ef-4e83-9a3c-c98ef206f7b0/volumes" Feb 18 11:56:24 crc kubenswrapper[4922]: I0218 11:56:24.200427 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:24 crc kubenswrapper[4922]: I0218 11:56:24.200704 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api-log" containerID="cri-o://f656a5eab3bc7d824593462123204765b1a3f4dc494bd8b0ed339492dd2506ae" gracePeriod=30 Feb 18 11:56:24 crc kubenswrapper[4922]: I0218 11:56:24.200911 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api" containerID="cri-o://0b0005d984f6a47d8e4c600b875d8f557a7684614b4f1a8f4fe087932cc0bade" gracePeriod=30 Feb 18 11:56:24 crc kubenswrapper[4922]: I0218 11:56:24.812737 
4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zmcwq" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Feb 18 11:56:25 crc kubenswrapper[4922]: E0218 11:56:25.133625 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:25 crc kubenswrapper[4922]: E0218 11:56:25.135575 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:25 crc kubenswrapper[4922]: E0218 11:56:25.137481 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:25 crc kubenswrapper[4922]: E0218 11:56:25.137522 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:25 crc kubenswrapper[4922]: I0218 11:56:25.922786 4922 generic.go:334] "Generic (PLEG): container finished" podID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" 
containerID="f656a5eab3bc7d824593462123204765b1a3f4dc494bd8b0ed339492dd2506ae" exitCode=143 Feb 18 11:56:25 crc kubenswrapper[4922]: I0218 11:56:25.922867 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ade99914-d8c2-4a1a-9492-d3bb2a83a64d","Type":"ContainerDied","Data":"f656a5eab3bc7d824593462123204765b1a3f4dc494bd8b0ed339492dd2506ae"} Feb 18 11:56:27 crc kubenswrapper[4922]: I0218 11:56:27.969549 4922 generic.go:334] "Generic (PLEG): container finished" podID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerID="0b0005d984f6a47d8e4c600b875d8f557a7684614b4f1a8f4fe087932cc0bade" exitCode=0 Feb 18 11:56:27 crc kubenswrapper[4922]: I0218 11:56:27.969578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ade99914-d8c2-4a1a-9492-d3bb2a83a64d","Type":"ContainerDied","Data":"0b0005d984f6a47d8e4c600b875d8f557a7684614b4f1a8f4fe087932cc0bade"} Feb 18 11:56:28 crc kubenswrapper[4922]: I0218 11:56:28.389926 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Feb 18 11:56:28 crc kubenswrapper[4922]: I0218 11:56:28.525993 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bbf5454f6-d5958" podUID="3bc8759d-86ff-415d-936a-064ef742f0d9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Feb 18 11:56:29 crc kubenswrapper[4922]: I0218 11:56:29.127313 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api" probeResult="failure" output="Get 
\"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Feb 18 11:56:29 crc kubenswrapper[4922]: I0218 11:56:29.127333 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Feb 18 11:56:30 crc kubenswrapper[4922]: E0218 11:56:30.167976 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:30 crc kubenswrapper[4922]: E0218 11:56:30.172141 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:30 crc kubenswrapper[4922]: E0218 11:56:30.173567 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:30 crc kubenswrapper[4922]: E0218 11:56:30.173628 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 
11:56:35 crc kubenswrapper[4922]: E0218 11:56:35.133201 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:35 crc kubenswrapper[4922]: E0218 11:56:35.135406 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:35 crc kubenswrapper[4922]: E0218 11:56:35.136331 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:35 crc kubenswrapper[4922]: E0218 11:56:35.136387 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:36 crc kubenswrapper[4922]: E0218 11:56:36.239873 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Feb 18 11:56:36 crc kubenswrapper[4922]: E0218 11:56:36.240020 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmwrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a3873c9e-308f-46ea-ac8f-4ee78ca92235): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 11:56:36 crc kubenswrapper[4922]: I0218 11:56:36.359466 4922 scope.go:117] "RemoveContainer" containerID="95d55b3314d502d1433cbc46f43ce2797b326e2911e0feadc7c77b360ebeb491" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.811287 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.839468 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-logs\") pod \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.839541 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-combined-ca-bundle\") pod \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.839611 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-custom-prometheus-ca\") pod \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.839631 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfnt\" (UniqueName: \"kubernetes.io/projected/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-kube-api-access-vhfnt\") pod \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.839653 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-config-data\") pod \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\" (UID: \"ade99914-d8c2-4a1a-9492-d3bb2a83a64d\") " Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.848700 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-logs" (OuterVolumeSpecName: "logs") pod "ade99914-d8c2-4a1a-9492-d3bb2a83a64d" (UID: "ade99914-d8c2-4a1a-9492-d3bb2a83a64d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.871554 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-kube-api-access-vhfnt" (OuterVolumeSpecName: "kube-api-access-vhfnt") pod "ade99914-d8c2-4a1a-9492-d3bb2a83a64d" (UID: "ade99914-d8c2-4a1a-9492-d3bb2a83a64d"). InnerVolumeSpecName "kube-api-access-vhfnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.909370 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ade99914-d8c2-4a1a-9492-d3bb2a83a64d" (UID: "ade99914-d8c2-4a1a-9492-d3bb2a83a64d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.929912 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ade99914-d8c2-4a1a-9492-d3bb2a83a64d" (UID: "ade99914-d8c2-4a1a-9492-d3bb2a83a64d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.951872 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.951914 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.951931 4922 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.951948 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfnt\" (UniqueName: \"kubernetes.io/projected/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-kube-api-access-vhfnt\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:37 crc kubenswrapper[4922]: I0218 11:56:37.989439 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-config-data" (OuterVolumeSpecName: "config-data") pod "ade99914-d8c2-4a1a-9492-d3bb2a83a64d" (UID: "ade99914-d8c2-4a1a-9492-d3bb2a83a64d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.054202 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade99914-d8c2-4a1a-9492-d3bb2a83a64d-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.056739 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b854f8786-pls2t" event={"ID":"2efd0609-4858-47ce-8213-6a74510e8acf","Type":"ContainerStarted","Data":"56fe05bf9b6b7132f87e22c520afda7a7ad80144754369b0d0ab69f18eb6ea5a"} Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.056901 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b854f8786-pls2t" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.061536 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqfhx" event={"ID":"31aad152-dcb7-472f-a0f8-d90ae972442b","Type":"ContainerStarted","Data":"fdbd37640dccec05f284f7ac0ade4661f3a31188c1203c2107b809735927d31b"} Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.067765 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ade99914-d8c2-4a1a-9492-d3bb2a83a64d","Type":"ContainerDied","Data":"fd553f3191fff116bcb61d1761b6014e37c31888c5dc3e6ec76e921294d669b1"} Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.067824 4922 scope.go:117] "RemoveContainer" containerID="0b0005d984f6a47d8e4c600b875d8f557a7684614b4f1a8f4fe087932cc0bade" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.067985 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.080913 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b854f8786-pls2t" podStartSLOduration=23.080895895 podStartE2EDuration="23.080895895s" podCreationTimestamp="2026-02-18 11:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:38.073945889 +0000 UTC m=+1199.801649959" watchObservedRunningTime="2026-02-18 11:56:38.080895895 +0000 UTC m=+1199.808599975" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.101070 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mqfhx" podStartSLOduration=5.031170574 podStartE2EDuration="1m19.101053945s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="2026-02-18 11:55:23.590736598 +0000 UTC m=+1125.318440668" lastFinishedPulling="2026-02-18 11:56:37.660619959 +0000 UTC m=+1199.388324039" observedRunningTime="2026-02-18 11:56:38.100015159 +0000 UTC m=+1199.827719289" watchObservedRunningTime="2026-02-18 11:56:38.101053945 +0000 UTC m=+1199.828758025" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.137800 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.140968 4922 scope.go:117] "RemoveContainer" containerID="f656a5eab3bc7d824593462123204765b1a3f4dc494bd8b0ed339492dd2506ae" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.156398 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.176747 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:38 crc kubenswrapper[4922]: E0218 11:56:38.177890 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api-log" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.177944 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api-log" Feb 18 11:56:38 crc kubenswrapper[4922]: E0218 11:56:38.177961 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerName="dnsmasq-dns" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.177967 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerName="dnsmasq-dns" Feb 18 11:56:38 crc kubenswrapper[4922]: E0218 11:56:38.178025 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerName="init" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.178033 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" containerName="init" Feb 18 11:56:38 crc kubenswrapper[4922]: E0218 11:56:38.178053 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.178059 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.178478 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.178538 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api-log" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.178554 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6256f2-24ef-4e83-9a3c-c98ef206f7b0" 
containerName="dnsmasq-dns" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.180136 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.180293 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.183665 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.183970 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.184260 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.360465 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-logs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.360522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-config-data\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.360579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc 
kubenswrapper[4922]: I0218 11:56:38.360779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jffxp\" (UniqueName: \"kubernetes.io/projected/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-kube-api-access-jffxp\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.360839 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.360879 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.361058 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-public-tls-certs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.462870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-public-tls-certs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.462965 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-logs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.463043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-config-data\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.463064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.463203 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jffxp\" (UniqueName: \"kubernetes.io/projected/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-kube-api-access-jffxp\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.463241 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.463286 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-internal-tls-certs\") pod \"watcher-api-0\" (UID: 
\"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.463671 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-logs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.471444 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.471966 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.474176 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-config-data\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.475085 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-public-tls-certs\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.476108 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.487141 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jffxp\" (UniqueName: \"kubernetes.io/projected/43b1edea-6c95-42ae-b30a-d3ce2eb1e0de-kube-api-access-jffxp\") pod \"watcher-api-0\" (UID: \"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de\") " pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.516769 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 18 11:56:38 crc kubenswrapper[4922]: I0218 11:56:38.822501 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.005844 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" path="/var/lib/kubelet/pods/ade99914-d8c2-4a1a-9492-d3bb2a83a64d/volumes" Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.092416 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rvpx7" event={"ID":"d7852f85-b8c5-458e-901c-3659c5ed2713","Type":"ContainerStarted","Data":"483ace8c8816a57983828e0b595d4b4e906ff56406e775bb5aac307ba89d5e30"} Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.095223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de","Type":"ContainerStarted","Data":"5150e1ea135ca30602971c47aa10843dc651670d87ed44c36bdcbd6e651bf96f"} Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.095264 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de","Type":"ContainerStarted","Data":"835aa3653d2597b3da5bab3dd4e7277274d3fa27526015cb79464cf620c313bf"} Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.127871 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.128346 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="ade99914-d8c2-4a1a-9492-d3bb2a83a64d" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.807394 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:56:39 crc kubenswrapper[4922]: I0218 11:56:39.807470 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.107609 4922 generic.go:334] "Generic (PLEG): container finished" podID="71318c6d-61ee-4fb4-8682-7cf3fc0ae044" containerID="5597fcea1ce088dbc75d5c2c33874c67f09dcb474426ad6372bb91fff5a29ef3" exitCode=0 Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.107664 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/placement-db-sync-clz29" event={"ID":"71318c6d-61ee-4fb4-8682-7cf3fc0ae044","Type":"ContainerDied","Data":"5597fcea1ce088dbc75d5c2c33874c67f09dcb474426ad6372bb91fff5a29ef3"} Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.112396 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"43b1edea-6c95-42ae-b30a-d3ce2eb1e0de","Type":"ContainerStarted","Data":"94eb0f2dfb2c70c102724ce9c06e392c0ec15f8e812096448c4b43eb5202059c"} Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.113428 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.131501 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rvpx7" podStartSLOduration=7.083547602 podStartE2EDuration="1m21.13033859s" podCreationTimestamp="2026-02-18 11:55:19 +0000 UTC" firstStartedPulling="2026-02-18 11:55:23.640709111 +0000 UTC m=+1125.368413191" lastFinishedPulling="2026-02-18 11:56:37.687500099 +0000 UTC m=+1199.415204179" observedRunningTime="2026-02-18 11:56:39.121243407 +0000 UTC m=+1200.848947497" watchObservedRunningTime="2026-02-18 11:56:40.13033859 +0000 UTC m=+1201.858042670" Feb 18 11:56:40 crc kubenswrapper[4922]: E0218 11:56:40.133254 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:40 crc kubenswrapper[4922]: E0218 11:56:40.134650 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:40 crc kubenswrapper[4922]: E0218 11:56:40.137752 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:40 crc kubenswrapper[4922]: E0218 11:56:40.137822 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.156780 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.156760138 podStartE2EDuration="2.156760138s" podCreationTimestamp="2026-02-18 11:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:40.149976176 +0000 UTC m=+1201.877680256" watchObservedRunningTime="2026-02-18 11:56:40.156760138 +0000 UTC m=+1201.884464218" Feb 18 11:56:40 crc kubenswrapper[4922]: I0218 11:56:40.993670 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:56:41 crc kubenswrapper[4922]: I0218 11:56:41.170416 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:56:42 crc kubenswrapper[4922]: I0218 11:56:42.129279 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:56:42 crc kubenswrapper[4922]: I0218 
11:56:42.759753 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 18 11:56:42 crc kubenswrapper[4922]: I0218 11:56:42.863546 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.146023 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" exitCode=137 Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.146119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad","Type":"ContainerDied","Data":"17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003"} Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.150592 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9e9fc515-5e15-41fc-8e76-b9f3af099a0f","Type":"ContainerDied","Data":"538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d"} Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.150518 4922 generic.go:334] "Generic (PLEG): container finished" podID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" containerID="538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d" exitCode=137 Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.238893 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bbf5454f6-d5958" Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.347821 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9d79df67b-mg9kq"] Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.348101 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" 
containerName="horizon-log" containerID="cri-o://666106d3ce79297e2d8ed502af3f7dcfa2bfde51946ace3e35cdb24c5fc323a9" gracePeriod=30 Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.348500 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon" containerID="cri-o://5d696a6b456c46bcc03a5a4e42955c244ebb50a3f0ae3ca3ca3bec18dc6258ea" gracePeriod=30 Feb 18 11:56:43 crc kubenswrapper[4922]: I0218 11:56:43.517345 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.164944 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-clz29" event={"ID":"71318c6d-61ee-4fb4-8682-7cf3fc0ae044","Type":"ContainerDied","Data":"c85150d5f0f00107589b346911782e9cacbfd0c2f75cf23b1582f2bb391c607c"} Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.165208 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85150d5f0f00107589b346911782e9cacbfd0c2f75cf23b1582f2bb391c607c" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.215183 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-clz29" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.297981 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-combined-ca-bundle\") pod \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.299010 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d8ft\" (UniqueName: \"kubernetes.io/projected/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-kube-api-access-5d8ft\") pod \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.299876 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-scripts\") pod \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.299936 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-logs\") pod \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.299959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-config-data\") pod \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\" (UID: \"71318c6d-61ee-4fb4-8682-7cf3fc0ae044\") " Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.300248 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-logs" (OuterVolumeSpecName: "logs") pod "71318c6d-61ee-4fb4-8682-7cf3fc0ae044" (UID: "71318c6d-61ee-4fb4-8682-7cf3fc0ae044"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.300651 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.304646 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-scripts" (OuterVolumeSpecName: "scripts") pod "71318c6d-61ee-4fb4-8682-7cf3fc0ae044" (UID: "71318c6d-61ee-4fb4-8682-7cf3fc0ae044"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.304730 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-kube-api-access-5d8ft" (OuterVolumeSpecName: "kube-api-access-5d8ft") pod "71318c6d-61ee-4fb4-8682-7cf3fc0ae044" (UID: "71318c6d-61ee-4fb4-8682-7cf3fc0ae044"). InnerVolumeSpecName "kube-api-access-5d8ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.326157 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-config-data" (OuterVolumeSpecName: "config-data") pod "71318c6d-61ee-4fb4-8682-7cf3fc0ae044" (UID: "71318c6d-61ee-4fb4-8682-7cf3fc0ae044"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.351295 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71318c6d-61ee-4fb4-8682-7cf3fc0ae044" (UID: "71318c6d-61ee-4fb4-8682-7cf3fc0ae044"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.402856 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.402893 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d8ft\" (UniqueName: \"kubernetes.io/projected/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-kube-api-access-5d8ft\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.402910 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:44 crc kubenswrapper[4922]: I0218 11:56:44.402920 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71318c6d-61ee-4fb4-8682-7cf3fc0ae044-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: E0218 11:56:45.132276 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003 is running failed: container process not found" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:45 crc kubenswrapper[4922]: E0218 11:56:45.133071 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003 is running failed: container process not found" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:45 crc kubenswrapper[4922]: E0218 11:56:45.133844 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003 is running failed: container process not found" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 18 11:56:45 crc kubenswrapper[4922]: E0218 11:56:45.133915 4922 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.179065 4922 generic.go:334] "Generic (PLEG): container finished" podID="31aad152-dcb7-472f-a0f8-d90ae972442b" containerID="fdbd37640dccec05f284f7ac0ade4661f3a31188c1203c2107b809735927d31b" exitCode=0 Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.179156 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-clz29" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.179396 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqfhx" event={"ID":"31aad152-dcb7-472f-a0f8-d90ae972442b","Type":"ContainerDied","Data":"fdbd37640dccec05f284f7ac0ade4661f3a31188c1203c2107b809735927d31b"} Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.342331 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7c7b84785b-f8lmj"] Feb 18 11:56:45 crc kubenswrapper[4922]: E0218 11:56:45.342836 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71318c6d-61ee-4fb4-8682-7cf3fc0ae044" containerName="placement-db-sync" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.342854 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="71318c6d-61ee-4fb4-8682-7cf3fc0ae044" containerName="placement-db-sync" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.352493 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="71318c6d-61ee-4fb4-8682-7cf3fc0ae044" containerName="placement-db-sync" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.356679 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c7b84785b-f8lmj"] Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.356798 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.361937 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pnzs4" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.362222 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.362971 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.363269 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.363430 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.432732 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-internal-tls-certs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.432815 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-config-data\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.433055 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-combined-ca-bundle\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.433099 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-public-tls-certs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.433136 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxhk\" (UniqueName: \"kubernetes.io/projected/280ad3f5-10de-4dc8-866b-c7502c004835-kube-api-access-xhxhk\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.433155 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-scripts\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.433389 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ad3f5-10de-4dc8-866b-c7502c004835-logs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535070 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-combined-ca-bundle\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535162 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-public-tls-certs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535215 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxhk\" (UniqueName: \"kubernetes.io/projected/280ad3f5-10de-4dc8-866b-c7502c004835-kube-api-access-xhxhk\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535241 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-scripts\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535287 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ad3f5-10de-4dc8-866b-c7502c004835-logs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535341 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-internal-tls-certs\") pod 
\"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.535428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-config-data\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.536104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ad3f5-10de-4dc8-866b-c7502c004835-logs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.540905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-internal-tls-certs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.540909 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-scripts\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.541845 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-public-tls-certs\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc 
kubenswrapper[4922]: I0218 11:56:45.542017 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-config-data\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.542160 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ad3f5-10de-4dc8-866b-c7502c004835-combined-ca-bundle\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.557168 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxhk\" (UniqueName: \"kubernetes.io/projected/280ad3f5-10de-4dc8-866b-c7502c004835-kube-api-access-xhxhk\") pod \"placement-7c7b84785b-f8lmj\" (UID: \"280ad3f5-10de-4dc8-866b-c7502c004835\") " pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.685561 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c7b84785b-f8lmj" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.737109 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.747025 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840216 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-config-data\") pod \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840331 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlfjr\" (UniqueName: \"kubernetes.io/projected/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-kube-api-access-dlfjr\") pod \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840399 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lkg5\" (UniqueName: \"kubernetes.io/projected/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-kube-api-access-6lkg5\") pod \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840469 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-custom-prometheus-ca\") pod \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840523 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-config-data\") pod \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840571 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-combined-ca-bundle\") pod \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840643 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-logs\") pod \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\" (UID: \"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840757 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-logs\") pod \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.840810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-combined-ca-bundle\") pod \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\" (UID: \"9e9fc515-5e15-41fc-8e76-b9f3af099a0f\") " Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.841959 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-logs" (OuterVolumeSpecName: "logs") pod "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" (UID: "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.842286 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-logs" (OuterVolumeSpecName: "logs") pod "9e9fc515-5e15-41fc-8e76-b9f3af099a0f" (UID: "9e9fc515-5e15-41fc-8e76-b9f3af099a0f"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.847180 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-kube-api-access-dlfjr" (OuterVolumeSpecName: "kube-api-access-dlfjr") pod "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" (UID: "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad"). InnerVolumeSpecName "kube-api-access-dlfjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.858699 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-kube-api-access-6lkg5" (OuterVolumeSpecName: "kube-api-access-6lkg5") pod "9e9fc515-5e15-41fc-8e76-b9f3af099a0f" (UID: "9e9fc515-5e15-41fc-8e76-b9f3af099a0f"). InnerVolumeSpecName "kube-api-access-6lkg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.896634 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e9fc515-5e15-41fc-8e76-b9f3af099a0f" (UID: "9e9fc515-5e15-41fc-8e76-b9f3af099a0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.904755 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9e9fc515-5e15-41fc-8e76-b9f3af099a0f" (UID: "9e9fc515-5e15-41fc-8e76-b9f3af099a0f"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.913491 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-config-data" (OuterVolumeSpecName: "config-data") pod "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" (UID: "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.942025 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" (UID: "0c614b6a-8d46-4a07-89f6-1a7cc64dfdad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943207 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943222 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943231 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943240 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlfjr\" (UniqueName: \"kubernetes.io/projected/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-kube-api-access-dlfjr\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc 
kubenswrapper[4922]: I0218 11:56:45.943250 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lkg5\" (UniqueName: \"kubernetes.io/projected/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-kube-api-access-6lkg5\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943258 4922 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943266 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.943273 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:45 crc kubenswrapper[4922]: I0218 11:56:45.958545 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-config-data" (OuterVolumeSpecName: "config-data") pod "9e9fc515-5e15-41fc-8e76-b9f3af099a0f" (UID: "9e9fc515-5e15-41fc-8e76-b9f3af099a0f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:46 crc kubenswrapper[4922]: E0218 11:56:46.046835 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.049113 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e9fc515-5e15-41fc-8e76-b9f3af099a0f-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.190910 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9e9fc515-5e15-41fc-8e76-b9f3af099a0f","Type":"ContainerDied","Data":"089198a21158f6f6b56f6630ad31cc5105f4b0001f6d2937e2ed72533d2aaec8"} Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.190950 4922 scope.go:117] "RemoveContainer" containerID="538ff05accf8e69796a85607f99df1d694327aae777210403771939c222ef98d" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.191064 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.194652 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0c614b6a-8d46-4a07-89f6-1a7cc64dfdad","Type":"ContainerDied","Data":"c1680a690b411a1b27ee20c720d7966628a4b267d54560c222bee5a47c521f74"} Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.194737 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.200661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3873c9e-308f-46ea-ac8f-4ee78ca92235","Type":"ContainerStarted","Data":"1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c"} Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.200787 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="ceilometer-notification-agent" containerID="cri-o://a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a" gracePeriod=30 Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.201407 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="proxy-httpd" containerID="cri-o://1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c" gracePeriod=30 Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.236185 4922 scope.go:117] "RemoveContainer" containerID="17cce53602151ba6d1c560fb5734c8a81055d3f6f58806eded906ecc6d4b9003" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.264501 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.286117 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.292590 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.308221 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.372428 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/watcher-applier-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: E0218 11:56:46.373344 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.373420 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:46 crc kubenswrapper[4922]: E0218 11:56:46.373431 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" containerName="watcher-decision-engine" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.373437 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" containerName="watcher-decision-engine" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.373836 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" containerName="watcher-decision-engine" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.373878 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" containerName="watcher-applier" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.374735 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.377325 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.388244 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.410300 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.411983 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.414922 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.420095 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466348 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466405 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zsgq\" (UniqueName: \"kubernetes.io/projected/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-kube-api-access-9zsgq\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466447 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-config-data\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6wh\" (UniqueName: \"kubernetes.io/projected/bdb6fddf-10f2-476b-822f-130f6fa12007-kube-api-access-ff6wh\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466526 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb6fddf-10f2-476b-822f-130f6fa12007-logs\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-logs\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466605 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.466633 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.529752 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c7b84785b-f8lmj"] Feb 18 11:56:46 crc kubenswrapper[4922]: W0218 11:56:46.532254 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod280ad3f5_10de_4dc8_866b_c7502c004835.slice/crio-18e2c3e882668ec70d6f9fd6f54e2aaecf4f14013bbcf2f79e39db1abbf09685 WatchSource:0}: Error finding container 18e2c3e882668ec70d6f9fd6f54e2aaecf4f14013bbcf2f79e39db1abbf09685: Status 404 returned error can't find the container with id 18e2c3e882668ec70d6f9fd6f54e2aaecf4f14013bbcf2f79e39db1abbf09685 Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.568052 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.568926 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-combined-ca-bundle\") pod 
\"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569054 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zsgq\" (UniqueName: \"kubernetes.io/projected/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-kube-api-access-9zsgq\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569111 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-config-data\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569157 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569196 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6wh\" (UniqueName: \"kubernetes.io/projected/bdb6fddf-10f2-476b-822f-130f6fa12007-kube-api-access-ff6wh\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " 
pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569227 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb6fddf-10f2-476b-822f-130f6fa12007-logs\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569297 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-logs\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.569887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-logs\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.573021 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb6fddf-10f2-476b-822f-130f6fa12007-logs\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.573877 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-config-data\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0" Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.574462 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.575874 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.575881 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.587167 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6wh\" (UniqueName: \"kubernetes.io/projected/bdb6fddf-10f2-476b-822f-130f6fa12007-kube-api-access-ff6wh\") pod \"watcher-decision-engine-0\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") " pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.589379 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zsgq\" (UniqueName: \"kubernetes.io/projected/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-kube-api-access-9zsgq\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.589980 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd84d8c9-0a98-4f6b-b6da-887f4d294a38-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cd84d8c9-0a98-4f6b-b6da-887f4d294a38\") " pod="openstack/watcher-applier-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.699167 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.737923 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.812758 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mqfhx"
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.878932 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-db-sync-config-data\") pod \"31aad152-dcb7-472f-a0f8-d90ae972442b\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") "
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.879561 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-combined-ca-bundle\") pod \"31aad152-dcb7-472f-a0f8-d90ae972442b\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") "
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.879627 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-725mj\" (UniqueName: \"kubernetes.io/projected/31aad152-dcb7-472f-a0f8-d90ae972442b-kube-api-access-725mj\") pod \"31aad152-dcb7-472f-a0f8-d90ae972442b\" (UID: \"31aad152-dcb7-472f-a0f8-d90ae972442b\") "
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.886253 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31aad152-dcb7-472f-a0f8-d90ae972442b-kube-api-access-725mj" (OuterVolumeSpecName: "kube-api-access-725mj") pod "31aad152-dcb7-472f-a0f8-d90ae972442b" (UID: "31aad152-dcb7-472f-a0f8-d90ae972442b"). InnerVolumeSpecName "kube-api-access-725mj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.888048 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "31aad152-dcb7-472f-a0f8-d90ae972442b" (UID: "31aad152-dcb7-472f-a0f8-d90ae972442b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.929760 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31aad152-dcb7-472f-a0f8-d90ae972442b" (UID: "31aad152-dcb7-472f-a0f8-d90ae972442b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.982672 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.982700 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-725mj\" (UniqueName: \"kubernetes.io/projected/31aad152-dcb7-472f-a0f8-d90ae972442b-kube-api-access-725mj\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:46 crc kubenswrapper[4922]: I0218 11:56:46.982710 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31aad152-dcb7-472f-a0f8-d90ae972442b-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.014482 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c614b6a-8d46-4a07-89f6-1a7cc64dfdad" path="/var/lib/kubelet/pods/0c614b6a-8d46-4a07-89f6-1a7cc64dfdad/volumes"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.015378 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9fc515-5e15-41fc-8e76-b9f3af099a0f" path="/var/lib/kubelet/pods/9e9fc515-5e15-41fc-8e76-b9f3af099a0f/volumes"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.222887 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.233423 4922 generic.go:334] "Generic (PLEG): container finished" podID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerID="1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c" exitCode=0
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.233501 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3873c9e-308f-46ea-ac8f-4ee78ca92235","Type":"ContainerDied","Data":"1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c"}
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.239416 4922 generic.go:334] "Generic (PLEG): container finished" podID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerID="5d696a6b456c46bcc03a5a4e42955c244ebb50a3f0ae3ca3ca3bec18dc6258ea" exitCode=0
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.239496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d79df67b-mg9kq" event={"ID":"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9","Type":"ContainerDied","Data":"5d696a6b456c46bcc03a5a4e42955c244ebb50a3f0ae3ca3ca3bec18dc6258ea"}
Feb 18 11:56:47 crc kubenswrapper[4922]: W0218 11:56:47.241083 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd84d8c9_0a98_4f6b_b6da_887f4d294a38.slice/crio-4dbe71facafe23dee1111cdea2c33c997148d5887f41ef4c5ff76c38faf18673 WatchSource:0}: Error finding container 4dbe71facafe23dee1111cdea2c33c997148d5887f41ef4c5ff76c38faf18673: Status 404 returned error can't find the container with id 4dbe71facafe23dee1111cdea2c33c997148d5887f41ef4c5ff76c38faf18673
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.243179 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqfhx" event={"ID":"31aad152-dcb7-472f-a0f8-d90ae972442b","Type":"ContainerDied","Data":"9b5d6cd64e7bf271ae9d437b3dc8665c7fa56f4097bd261a86583c671e88f922"}
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.243216 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5d6cd64e7bf271ae9d437b3dc8665c7fa56f4097bd261a86583c671e88f922"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.243192 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mqfhx"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.249949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c7b84785b-f8lmj" event={"ID":"280ad3f5-10de-4dc8-866b-c7502c004835","Type":"ContainerStarted","Data":"527f7ccfe35683cb19a402f51d6fe18b2f99d6b702a02283209d599ae2f74449"}
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.250012 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c7b84785b-f8lmj" event={"ID":"280ad3f5-10de-4dc8-866b-c7502c004835","Type":"ContainerStarted","Data":"f4c4c35c75d436d0fa800652e1da8b91f7e3df925d6dd620eb4f664edd6c92b1"}
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.250027 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c7b84785b-f8lmj" event={"ID":"280ad3f5-10de-4dc8-866b-c7502c004835","Type":"ContainerStarted","Data":"18e2c3e882668ec70d6f9fd6f54e2aaecf4f14013bbcf2f79e39db1abbf09685"}
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.250528 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7c7b84785b-f8lmj"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.250651 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7c7b84785b-f8lmj"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.324509 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7c7b84785b-f8lmj" podStartSLOduration=2.324191738 podStartE2EDuration="2.324191738s" podCreationTimestamp="2026-02-18 11:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:47.322387562 +0000 UTC m=+1209.050091652" watchObservedRunningTime="2026-02-18 11:56:47.324191738 +0000 UTC m=+1209.051895808"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.394327 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.563725 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-676bd4cb85-2ggtc"]
Feb 18 11:56:47 crc kubenswrapper[4922]: E0218 11:56:47.564117 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31aad152-dcb7-472f-a0f8-d90ae972442b" containerName="barbican-db-sync"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.564128 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="31aad152-dcb7-472f-a0f8-d90ae972442b" containerName="barbican-db-sync"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.564309 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="31aad152-dcb7-472f-a0f8-d90ae972442b" containerName="barbican-db-sync"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.565271 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.570695 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.571008 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pktbk"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.579935 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.617808 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-config-data\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.617856 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93be7893-0b89-4762-870d-f5878ecddb3b-logs\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.617889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-config-data-custom\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.617951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-combined-ca-bundle\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.617973 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9mb\" (UniqueName: \"kubernetes.io/projected/93be7893-0b89-4762-870d-f5878ecddb3b-kube-api-access-mr9mb\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.651829 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-676bd4cb85-2ggtc"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.705194 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78995b5fcd-pmbbf"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.707077 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.709915 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.737796 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-config-data\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.738108 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93be7893-0b89-4762-870d-f5878ecddb3b-logs\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.738136 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-config-data-custom\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.738211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-combined-ca-bundle\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.738233 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9mb\" (UniqueName: \"kubernetes.io/projected/93be7893-0b89-4762-870d-f5878ecddb3b-kube-api-access-mr9mb\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.740545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93be7893-0b89-4762-870d-f5878ecddb3b-logs\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.761406 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-config-data-custom\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.765388 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-config-data\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.775492 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be7893-0b89-4762-870d-f5878ecddb3b-combined-ca-bundle\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.782544 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78995b5fcd-pmbbf"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.807432 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9mb\" (UniqueName: \"kubernetes.io/projected/93be7893-0b89-4762-870d-f5878ecddb3b-kube-api-access-mr9mb\") pod \"barbican-worker-676bd4cb85-2ggtc\" (UID: \"93be7893-0b89-4762-870d-f5878ecddb3b\") " pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.826268 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bdb97b9f9-22mm8"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.828328 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.841673 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdb97b9f9-22mm8"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.841860 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2664c9b6-f62a-4453-8771-8c273f5f9ec1-logs\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.841931 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-config-data-custom\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.842065 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8ls\" (UniqueName: \"kubernetes.io/projected/2664c9b6-f62a-4453-8771-8c273f5f9ec1-kube-api-access-xf8ls\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.842168 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-config-data\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.842294 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-combined-ca-bundle\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.875198 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d677498bd-cxq98"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.877058 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.879864 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.886514 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d677498bd-cxq98"]
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945592 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-combined-ca-bundle\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945645 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-combined-ca-bundle\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945682 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945705 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data-custom\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945733 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2664c9b6-f62a-4453-8771-8c273f5f9ec1-logs\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945751 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmzm\" (UniqueName: \"kubernetes.io/projected/53371b07-a65f-4fec-8564-bcd51df6c010-kube-api-access-jdmzm\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945807 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-svc\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945834 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-config-data-custom\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-config\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945891 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8ls\" (UniqueName: \"kubernetes.io/projected/2664c9b6-f62a-4453-8771-8c273f5f9ec1-kube-api-access-xf8ls\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945935 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945965 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-config-data\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.945985 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53371b07-a65f-4fec-8564-bcd51df6c010-logs\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.946021 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.946562 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ptt\" (UniqueName: \"kubernetes.io/projected/d15f2ab3-202d-4241-a636-4d00475874aa-kube-api-access-b8ptt\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.947117 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2664c9b6-f62a-4453-8771-8c273f5f9ec1-logs\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.956123 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-config-data\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.959736 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-config-data-custom\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.963765 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2664c9b6-f62a-4453-8771-8c273f5f9ec1-combined-ca-bundle\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.971931 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-676bd4cb85-2ggtc"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.987793 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:56:47 crc kubenswrapper[4922]: I0218 11:56:47.988020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8ls\" (UniqueName: \"kubernetes.io/projected/2664c9b6-f62a-4453-8771-8c273f5f9ec1-kube-api-access-xf8ls\") pod \"barbican-keystone-listener-78995b5fcd-pmbbf\" (UID: \"2664c9b6-f62a-4453-8771-8c273f5f9ec1\") " pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.042911 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-combined-ca-bundle\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052124 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052159 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data-custom\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052197 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052232 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmzm\" (UniqueName: \"kubernetes.io/projected/53371b07-a65f-4fec-8564-bcd51df6c010-kube-api-access-jdmzm\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-svc\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052345 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-config\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052615 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53371b07-a65f-4fec-8564-bcd51df6c010-logs\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.052744 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ptt\" (UniqueName: \"kubernetes.io/projected/d15f2ab3-202d-4241-a636-4d00475874aa-kube-api-access-b8ptt\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.054512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53371b07-a65f-4fec-8564-bcd51df6c010-logs\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.058637 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-config\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.060230 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data-custom\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.062769 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.062999 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-combined-ca-bundle\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.063571 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.066070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-svc\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.066257 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.067467 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.086800 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmzm\" (UniqueName: \"kubernetes.io/projected/53371b07-a65f-4fec-8564-bcd51df6c010-kube-api-access-jdmzm\") pod \"barbican-api-6d677498bd-cxq98\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.102499 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ptt\" (UniqueName: \"kubernetes.io/projected/d15f2ab3-202d-4241-a636-4d00475874aa-kube-api-access-b8ptt\") pod \"dnsmasq-dns-5bdb97b9f9-22mm8\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8"
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.157804 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmwrr\" (UniqueName: \"kubernetes.io/projected/a3873c9e-308f-46ea-ac8f-4ee78ca92235-kube-api-access-jmwrr\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") "
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.158102 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-config-data\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") "
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.158125 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-scripts\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") "
Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.158171 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-run-httpd\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: 
\"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.158249 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-sg-core-conf-yaml\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.158283 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-combined-ca-bundle\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.158352 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-log-httpd\") pod \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\" (UID: \"a3873c9e-308f-46ea-ac8f-4ee78ca92235\") " Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.161301 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.164287 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.175247 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3873c9e-308f-46ea-ac8f-4ee78ca92235-kube-api-access-jmwrr" (OuterVolumeSpecName: "kube-api-access-jmwrr") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "kube-api-access-jmwrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.181706 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-scripts" (OuterVolumeSpecName: "scripts") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.201244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.263122 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.263164 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.263197 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3873c9e-308f-46ea-ac8f-4ee78ca92235-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.263208 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmwrr\" (UniqueName: \"kubernetes.io/projected/a3873c9e-308f-46ea-ac8f-4ee78ca92235-kube-api-access-jmwrr\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.263265 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.280936 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.290831 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bdb6fddf-10f2-476b-822f-130f6fa12007","Type":"ContainerStarted","Data":"bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115"} Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.290874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bdb6fddf-10f2-476b-822f-130f6fa12007","Type":"ContainerStarted","Data":"767ebbd1c5208f481e0a0c9a07d1e2942ae4da643c6fc17067643a26968c3ac5"} Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.293487 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"cd84d8c9-0a98-4f6b-b6da-887f4d294a38","Type":"ContainerStarted","Data":"4708bc7e5a3ee308741085c1464cbadd339fb2255ed09bf71ea1b5148a7b42d4"} Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.293524 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"cd84d8c9-0a98-4f6b-b6da-887f4d294a38","Type":"ContainerStarted","Data":"4dbe71facafe23dee1111cdea2c33c997148d5887f41ef4c5ff76c38faf18673"} Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.312781 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d677498bd-cxq98" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.315767 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.320420 4922 generic.go:334] "Generic (PLEG): container finished" podID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerID="a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a" exitCode=0 Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.322151 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.322638 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3873c9e-308f-46ea-ac8f-4ee78ca92235","Type":"ContainerDied","Data":"a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a"} Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.322671 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3873c9e-308f-46ea-ac8f-4ee78ca92235","Type":"ContainerDied","Data":"d077394e189534489b8a5cebe609760985017f8cfefceb856dec5e4e90cc20e1"} Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.322692 4922 scope.go:117] "RemoveContainer" containerID="1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.329275 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.329247279 podStartE2EDuration="2.329247279s" podCreationTimestamp="2026-02-18 11:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:48.316076626 +0000 UTC m=+1210.043780726" watchObservedRunningTime="2026-02-18 11:56:48.329247279 +0000 UTC m=+1210.056951359" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.341348 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" 
podStartSLOduration=2.341331124 podStartE2EDuration="2.341331124s" podCreationTimestamp="2026-02-18 11:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:48.335874216 +0000 UTC m=+1210.063578296" watchObservedRunningTime="2026-02-18 11:56:48.341331124 +0000 UTC m=+1210.069035204" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.365638 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.391258 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.410157 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-config-data" (OuterVolumeSpecName: "config-data") pod "a3873c9e-308f-46ea-ac8f-4ee78ca92235" (UID: "a3873c9e-308f-46ea-ac8f-4ee78ca92235"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.454786 4922 scope.go:117] "RemoveContainer" containerID="a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.455761 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-676bd4cb85-2ggtc"] Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.470137 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3873c9e-308f-46ea-ac8f-4ee78ca92235-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.518760 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.522579 4922 scope.go:117] "RemoveContainer" containerID="1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c" Feb 18 11:56:48 crc kubenswrapper[4922]: E0218 11:56:48.524975 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c\": container with ID starting with 1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c not found: ID does not exist" containerID="1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.525038 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c"} err="failed to get container status \"1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c\": rpc error: code = NotFound desc = could not find container \"1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c\": container with ID starting with 
1f093722d754d2692788936f8e0c90e9361058a061c775d23bfb1b1c56a36b4c not found: ID does not exist" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.525082 4922 scope.go:117] "RemoveContainer" containerID="a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a" Feb 18 11:56:48 crc kubenswrapper[4922]: E0218 11:56:48.526488 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a\": container with ID starting with a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a not found: ID does not exist" containerID="a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.526602 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a"} err="failed to get container status \"a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a\": rpc error: code = NotFound desc = could not find container \"a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a\": container with ID starting with a7b1fa49f2a63b5b267b8b5143ef790e17f341b707fb18ac5be2dd797d33cc2a not found: ID does not exist" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.641200 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.815876 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78995b5fcd-pmbbf"] Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.834349 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.843441 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:56:48 crc 
kubenswrapper[4922]: I0218 11:56:48.854673 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:56:48 crc kubenswrapper[4922]: E0218 11:56:48.855115 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="proxy-httpd" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.855138 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="proxy-httpd" Feb 18 11:56:48 crc kubenswrapper[4922]: E0218 11:56:48.855191 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="ceilometer-notification-agent" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.855198 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="ceilometer-notification-agent" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.855393 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="ceilometer-notification-agent" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.855412 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" containerName="proxy-httpd" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.857172 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.861082 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.876118 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.876693 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.990565 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3873c9e-308f-46ea-ac8f-4ee78ca92235" path="/var/lib/kubelet/pods/a3873c9e-308f-46ea-ac8f-4ee78ca92235/volumes" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.990888 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-config-data\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.991698 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6qk\" (UniqueName: \"kubernetes.io/projected/31a682bb-b881-47f5-960b-c6ae54c24275-kube-api-access-tt6qk\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.991976 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-log-httpd\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.992279 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.992326 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-scripts\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.992415 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-run-httpd\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:48 crc kubenswrapper[4922]: I0218 11:56:48.999733 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.094149 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d677498bd-cxq98"] Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.101969 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6qk\" (UniqueName: \"kubernetes.io/projected/31a682bb-b881-47f5-960b-c6ae54c24275-kube-api-access-tt6qk\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.102173 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-log-httpd\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.102418 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.102541 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-scripts\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.102658 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-run-httpd\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.102770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.102968 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-config-data\") pod \"ceilometer-0\" (UID: 
\"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.111576 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-log-httpd\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.113779 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-run-httpd\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.119110 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-scripts\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.119110 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.125270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-config-data\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.133925 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6qk\" (UniqueName: 
\"kubernetes.io/projected/31a682bb-b881-47f5-960b-c6ae54c24275-kube-api-access-tt6qk\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.186147 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.209795 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.227461 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdb97b9f9-22mm8"] Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.273988 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b854f8786-pls2t" Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.383897 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d677498bd-cxq98" event={"ID":"53371b07-a65f-4fec-8564-bcd51df6c010","Type":"ContainerStarted","Data":"6019911b846afdb67dccdb7e03471f7cbf11c9aa081bfb07a432273f6bc2c54b"} Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.401775 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-676bd4cb85-2ggtc" event={"ID":"93be7893-0b89-4762-870d-f5878ecddb3b","Type":"ContainerStarted","Data":"b7ac1e08c61e2809e23e4e7b68ab2c87a26179cf441a50aa9a12026cabe3e74a"} Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.412525 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" 
event={"ID":"d15f2ab3-202d-4241-a636-4d00475874aa","Type":"ContainerStarted","Data":"b70a4c113ae5df77985adf93070afceb0e7e0972f2d420cb43d3ae8ed3526536"} Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.445313 4922 generic.go:334] "Generic (PLEG): container finished" podID="d7852f85-b8c5-458e-901c-3659c5ed2713" containerID="483ace8c8816a57983828e0b595d4b4e906ff56406e775bb5aac307ba89d5e30" exitCode=0 Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.445461 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rvpx7" event={"ID":"d7852f85-b8c5-458e-901c-3659c5ed2713","Type":"ContainerDied","Data":"483ace8c8816a57983828e0b595d4b4e906ff56406e775bb5aac307ba89d5e30"} Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.456553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf" event={"ID":"2664c9b6-f62a-4453-8771-8c273f5f9ec1","Type":"ContainerStarted","Data":"108bfbb2b14078785a032b05d82d62866bc5e08d6fa3e8a5b2f4eae587e2386f"} Feb 18 11:56:49 crc kubenswrapper[4922]: I0218 11:56:49.487163 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.103585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.501843 4922 generic.go:334] "Generic (PLEG): container finished" podID="d15f2ab3-202d-4241-a636-4d00475874aa" containerID="007affd267df0eaaea70525abb6cf65b3773291056b908124aaf8cd367384660" exitCode=0 Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.501991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" event={"ID":"d15f2ab3-202d-4241-a636-4d00475874aa","Type":"ContainerDied","Data":"007affd267df0eaaea70525abb6cf65b3773291056b908124aaf8cd367384660"} Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.510053 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerStarted","Data":"2f2fa694e60fe2de69033e6edac945c54110ee47bb40b70b10dec8b4f330dcd3"} Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.568851 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d677498bd-cxq98" event={"ID":"53371b07-a65f-4fec-8564-bcd51df6c010","Type":"ContainerStarted","Data":"d307cc88ce59602f69621bcf0371da0ee28a8272ec22fc60147af239fd78badf"} Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.569163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d677498bd-cxq98" event={"ID":"53371b07-a65f-4fec-8564-bcd51df6c010","Type":"ContainerStarted","Data":"95063da6319615da44d0b1f12dd9abb6a109354894bb0cb11f9c25d007446b4c"} Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.570147 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d677498bd-cxq98" Feb 18 11:56:50 crc kubenswrapper[4922]: I0218 11:56:50.570183 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d677498bd-cxq98" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.183802 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d677498bd-cxq98" podStartSLOduration=4.183781428 podStartE2EDuration="4.183781428s" podCreationTimestamp="2026-02-18 11:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:50.616699001 +0000 UTC m=+1212.344403081" watchObservedRunningTime="2026-02-18 11:56:51.183781428 +0000 UTC m=+1212.911485528" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.195462 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-794d859fd8-fbbnx"] Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 
11:56:51.202409 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.209310 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.209621 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.218305 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-794d859fd8-fbbnx"] Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323381 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-config-data-custom\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323433 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-public-tls-certs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323592 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-config-data\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323667 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-internal-tls-certs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323702 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrm7\" (UniqueName: \"kubernetes.io/projected/d8d3eec1-763e-4874-b2af-19401e383fed-kube-api-access-ngrm7\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-combined-ca-bundle\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.323738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d3eec1-763e-4874-b2af-19401e383fed-logs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428484 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-config-data\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428651 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-internal-tls-certs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428714 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrm7\" (UniqueName: \"kubernetes.io/projected/d8d3eec1-763e-4874-b2af-19401e383fed-kube-api-access-ngrm7\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428742 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-combined-ca-bundle\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428774 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d3eec1-763e-4874-b2af-19401e383fed-logs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428832 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-config-data-custom\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.428856 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-public-tls-certs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.429497 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8d3eec1-763e-4874-b2af-19401e383fed-logs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.434270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-combined-ca-bundle\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.434882 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-config-data\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.436063 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-internal-tls-certs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.439617 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-config-data-custom\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.440516 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d3eec1-763e-4874-b2af-19401e383fed-public-tls-certs\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.456240 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrm7\" (UniqueName: \"kubernetes.io/projected/d8d3eec1-763e-4874-b2af-19401e383fed-kube-api-access-ngrm7\") pod \"barbican-api-794d859fd8-fbbnx\" (UID: \"d8d3eec1-763e-4874-b2af-19401e383fed\") " pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.536932 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.588578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" event={"ID":"d15f2ab3-202d-4241-a636-4d00475874aa","Type":"ContainerStarted","Data":"8439e696b2fd32672982849d8b37ce88e9906b06a73598f166488f6da611b3af"} Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.588921 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.613860 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" podStartSLOduration=4.613825201 podStartE2EDuration="4.613825201s" podCreationTimestamp="2026-02-18 11:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:51.609217414 +0000 UTC m=+1213.336921494" watchObservedRunningTime="2026-02-18 11:56:51.613825201 +0000 UTC m=+1213.341529291" Feb 18 11:56:51 crc kubenswrapper[4922]: I0218 11:56:51.700243 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.550499 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.600685 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7852f85-b8c5-458e-901c-3659c5ed2713-etc-machine-id\") pod \"d7852f85-b8c5-458e-901c-3659c5ed2713\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.600858 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-scripts\") pod \"d7852f85-b8c5-458e-901c-3659c5ed2713\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.600936 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h8nh\" (UniqueName: \"kubernetes.io/projected/d7852f85-b8c5-458e-901c-3659c5ed2713-kube-api-access-4h8nh\") pod \"d7852f85-b8c5-458e-901c-3659c5ed2713\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.600999 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-combined-ca-bundle\") pod \"d7852f85-b8c5-458e-901c-3659c5ed2713\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.601027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-config-data\") pod \"d7852f85-b8c5-458e-901c-3659c5ed2713\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.601164 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-db-sync-config-data\") pod \"d7852f85-b8c5-458e-901c-3659c5ed2713\" (UID: \"d7852f85-b8c5-458e-901c-3659c5ed2713\") " Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.605845 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7852f85-b8c5-458e-901c-3659c5ed2713-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d7852f85-b8c5-458e-901c-3659c5ed2713" (UID: "d7852f85-b8c5-458e-901c-3659c5ed2713"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.617193 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-scripts" (OuterVolumeSpecName: "scripts") pod "d7852f85-b8c5-458e-901c-3659c5ed2713" (UID: "d7852f85-b8c5-458e-901c-3659c5ed2713"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.622869 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d7852f85-b8c5-458e-901c-3659c5ed2713" (UID: "d7852f85-b8c5-458e-901c-3659c5ed2713"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.626703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7852f85-b8c5-458e-901c-3659c5ed2713-kube-api-access-4h8nh" (OuterVolumeSpecName: "kube-api-access-4h8nh") pod "d7852f85-b8c5-458e-901c-3659c5ed2713" (UID: "d7852f85-b8c5-458e-901c-3659c5ed2713"). InnerVolumeSpecName "kube-api-access-4h8nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.630035 4922 generic.go:334] "Generic (PLEG): container finished" podID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" containerID="f6b3d065bfd75b2cfcd638eb81caf2052d3e13f6ce152807bf20c4d48f24622e" exitCode=0 Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.630161 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-st9pz" event={"ID":"855fb3ec-e473-4a99-a94f-cc96dda6d9c4","Type":"ContainerDied","Data":"f6b3d065bfd75b2cfcd638eb81caf2052d3e13f6ce152807bf20c4d48f24622e"} Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.635417 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rvpx7" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.635652 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rvpx7" event={"ID":"d7852f85-b8c5-458e-901c-3659c5ed2713","Type":"ContainerDied","Data":"b5010985da36e7523bd0bc3fdfdcc8c443c58a1cc63438b6b3af9b7f64ca52d5"} Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.635676 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5010985da36e7523bd0bc3fdfdcc8c443c58a1cc63438b6b3af9b7f64ca52d5" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.698908 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-config-data" (OuterVolumeSpecName: "config-data") pod "d7852f85-b8c5-458e-901c-3659c5ed2713" (UID: "d7852f85-b8c5-458e-901c-3659c5ed2713"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.707013 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.707061 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h8nh\" (UniqueName: \"kubernetes.io/projected/d7852f85-b8c5-458e-901c-3659c5ed2713-kube-api-access-4h8nh\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.707078 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.707104 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.707118 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7852f85-b8c5-458e-901c-3659c5ed2713-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.725405 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7852f85-b8c5-458e-901c-3659c5ed2713" (UID: "d7852f85-b8c5-458e-901c-3659c5ed2713"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.809032 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7852f85-b8c5-458e-901c-3659c5ed2713-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:52 crc kubenswrapper[4922]: I0218 11:56:52.964450 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-794d859fd8-fbbnx"] Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.234560 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 18 11:56:53 crc kubenswrapper[4922]: E0218 11:56:53.234973 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7852f85-b8c5-458e-901c-3659c5ed2713" containerName="cinder-db-sync" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.234992 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7852f85-b8c5-458e-901c-3659c5ed2713" containerName="cinder-db-sync" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.235156 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7852f85-b8c5-458e-901c-3659c5ed2713" containerName="cinder-db-sync" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.235802 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.239185 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5hxjt" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.239449 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.239617 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.263324 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.322818 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/245b1cb9-d98f-4875-adf6-ab887f76849d-openstack-config-secret\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.322914 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245b1cb9-d98f-4875-adf6-ab887f76849d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.322957 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7gp9\" (UniqueName: \"kubernetes.io/projected/245b1cb9-d98f-4875-adf6-ab887f76849d-kube-api-access-t7gp9\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.322976 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/245b1cb9-d98f-4875-adf6-ab887f76849d-openstack-config\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.424911 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245b1cb9-d98f-4875-adf6-ab887f76849d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.424985 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7gp9\" (UniqueName: \"kubernetes.io/projected/245b1cb9-d98f-4875-adf6-ab887f76849d-kube-api-access-t7gp9\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.425005 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/245b1cb9-d98f-4875-adf6-ab887f76849d-openstack-config\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.425192 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/245b1cb9-d98f-4875-adf6-ab887f76849d-openstack-config-secret\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.426236 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/245b1cb9-d98f-4875-adf6-ab887f76849d-openstack-config\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.429787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/245b1cb9-d98f-4875-adf6-ab887f76849d-openstack-config-secret\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.431600 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245b1cb9-d98f-4875-adf6-ab887f76849d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.444077 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7gp9\" (UniqueName: \"kubernetes.io/projected/245b1cb9-d98f-4875-adf6-ab887f76849d-kube-api-access-t7gp9\") pod \"openstackclient\" (UID: \"245b1cb9-d98f-4875-adf6-ab887f76849d\") " pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.568053 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.666031 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794d859fd8-fbbnx" event={"ID":"d8d3eec1-763e-4874-b2af-19401e383fed","Type":"ContainerStarted","Data":"8a38b5b6376e30dcea7ce225715f54f7135ad4fc04a4edab5936124dc4064df7"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.666084 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794d859fd8-fbbnx" event={"ID":"d8d3eec1-763e-4874-b2af-19401e383fed","Type":"ContainerStarted","Data":"1a28858de0c713418f4f9ee6545f7adadbfc9acbc4e0e4d9bc3221e41de85448"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.666100 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794d859fd8-fbbnx" event={"ID":"d8d3eec1-763e-4874-b2af-19401e383fed","Type":"ContainerStarted","Data":"21e3cc36af7c6097e3f661f26114d5be79ab01892f5cbbbbd05ee60714014159"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.667288 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.667326 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.674269 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-676bd4cb85-2ggtc" event={"ID":"93be7893-0b89-4762-870d-f5878ecddb3b","Type":"ContainerStarted","Data":"fae49d17506120b2a78bcfca5e18c4d6fd434cdafa79a896bc1f733849d9800b"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.674313 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-676bd4cb85-2ggtc" event={"ID":"93be7893-0b89-4762-870d-f5878ecddb3b","Type":"ContainerStarted","Data":"79b1f242c2b5b0800d2c146fe57c62199ca37df35738ebc40daa8f9fac0612d4"} 
Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.683236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerStarted","Data":"71670bed8cf5414a16d8e3139affef7f174a018a3fa1f96f685382e7b236b20a"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.683287 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerStarted","Data":"ba51c32817af83f75090d7c0f3e6b11f618358a906fe3fa4abfa17b358890f4c"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.688064 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf" event={"ID":"2664c9b6-f62a-4453-8771-8c273f5f9ec1","Type":"ContainerStarted","Data":"756d7e01bc6c98bf139561d99c15433814b74887d986df3ef8da0eb944781525"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.688120 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf" event={"ID":"2664c9b6-f62a-4453-8771-8c273f5f9ec1","Type":"ContainerStarted","Data":"af357bdbc40a1eae94c5dcecdd2fd47279eb9382c9a8f5047d146d2b14e62c89"} Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.712743 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-794d859fd8-fbbnx" podStartSLOduration=2.712722596 podStartE2EDuration="2.712722596s" podCreationTimestamp="2026-02-18 11:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:53.697543612 +0000 UTC m=+1215.425247712" watchObservedRunningTime="2026-02-18 11:56:53.712722596 +0000 UTC m=+1215.440426676" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.782110 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-676bd4cb85-2ggtc" podStartSLOduration=2.850595521 podStartE2EDuration="6.782089169s" podCreationTimestamp="2026-02-18 11:56:47 +0000 UTC" firstStartedPulling="2026-02-18 11:56:48.48712123 +0000 UTC m=+1210.214825310" lastFinishedPulling="2026-02-18 11:56:52.418614878 +0000 UTC m=+1214.146318958" observedRunningTime="2026-02-18 11:56:53.729043258 +0000 UTC m=+1215.456747338" watchObservedRunningTime="2026-02-18 11:56:53.782089169 +0000 UTC m=+1215.509793249" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.888335 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78995b5fcd-pmbbf" podStartSLOduration=3.243567477 podStartE2EDuration="6.888301295s" podCreationTimestamp="2026-02-18 11:56:47 +0000 UTC" firstStartedPulling="2026-02-18 11:56:48.773221563 +0000 UTC m=+1210.500925643" lastFinishedPulling="2026-02-18 11:56:52.417955391 +0000 UTC m=+1214.145659461" observedRunningTime="2026-02-18 11:56:53.793712523 +0000 UTC m=+1215.521416603" watchObservedRunningTime="2026-02-18 11:56:53.888301295 +0000 UTC m=+1215.616005375" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.898326 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.907107 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.918037 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xpg9l" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.919035 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.919220 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.933535 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.956512 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.956571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r77f\" (UniqueName: \"kubernetes.io/projected/74a83ecb-de31-4767-a178-bccf8a37e93e-kube-api-access-4r77f\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.956594 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.956637 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a83ecb-de31-4767-a178-bccf8a37e93e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.956655 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-scripts\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.956747 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:53 crc kubenswrapper[4922]: I0218 11:56:53.961446 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.020563 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdb97b9f9-22mm8"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.065596 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a83ecb-de31-4767-a178-bccf8a37e93e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.065659 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-scripts\") 
pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.065960 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.066146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.066208 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r77f\" (UniqueName: \"kubernetes.io/projected/74a83ecb-de31-4767-a178-bccf8a37e93e-kube-api-access-4r77f\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.066243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.069824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a83ecb-de31-4767-a178-bccf8a37e93e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc 
kubenswrapper[4922]: I0218 11:56:54.077478 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dc6887bc5-xfpt6"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.079780 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.080783 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-scripts\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.094335 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.099080 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.099217 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.102403 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r77f\" (UniqueName: 
\"kubernetes.io/projected/74a83ecb-de31-4767-a178-bccf8a37e93e-kube-api-access-4r77f\") pod \"cinder-scheduler-0\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") " pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.123472 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc6887bc5-xfpt6"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.161860 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.164508 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.167758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-swift-storage-0\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.167817 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-svc\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.167887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-nb\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.167908 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-sb\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.167992 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-config\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.168059 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgkxw\" (UniqueName: \"kubernetes.io/projected/c5a7cb59-a6a3-4653-a63a-5942277f6663-kube-api-access-rgkxw\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.168546 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.233569 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.265379 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-swift-storage-0\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270746 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-svc\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270796 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tqmb\" (UniqueName: \"kubernetes.io/projected/ed1bc0f3-a613-4565-b1f5-e962556acb00-kube-api-access-2tqmb\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270827 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-nb\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270846 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-sb\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 
18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270877 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270919 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-scripts\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data-custom\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270958 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-config\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1bc0f3-a613-4565-b1f5-e962556acb00-logs\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.270998 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.271043 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed1bc0f3-a613-4565-b1f5-e962556acb00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.271068 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgkxw\" (UniqueName: \"kubernetes.io/projected/c5a7cb59-a6a3-4653-a63a-5942277f6663-kube-api-access-rgkxw\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.272600 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-swift-storage-0\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.272659 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-svc\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.273094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-config\") pod 
\"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.273311 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-nb\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.273506 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-sb\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.297401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgkxw\" (UniqueName: \"kubernetes.io/projected/c5a7cb59-a6a3-4653-a63a-5942277f6663-kube-api-access-rgkxw\") pod \"dnsmasq-dns-dc6887bc5-xfpt6\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") " pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.374020 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tqmb\" (UniqueName: \"kubernetes.io/projected/ed1bc0f3-a613-4565-b1f5-e962556acb00-kube-api-access-2tqmb\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.375695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 
18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.375867 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-scripts\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.375950 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data-custom\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.376189 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1bc0f3-a613-4565-b1f5-e962556acb00-logs\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.376349 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.376962 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed1bc0f3-a613-4565-b1f5-e962556acb00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.378917 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ed1bc0f3-a613-4565-b1f5-e962556acb00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.379449 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1bc0f3-a613-4565-b1f5-e962556acb00-logs\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.384977 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.394005 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-scripts\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.394573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data-custom\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.408048 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.441781 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2tqmb\" (UniqueName: \"kubernetes.io/projected/ed1bc0f3-a613-4565-b1f5-e962556acb00-kube-api-access-2tqmb\") pod \"cinder-api-0\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") " pod="openstack/cinder-api-0" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.553891 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.644446 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.698426 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" containerName="dnsmasq-dns" containerID="cri-o://8439e696b2fd32672982849d8b37ce88e9906b06a73598f166488f6da611b3af" gracePeriod=10 Feb 18 11:56:54 crc kubenswrapper[4922]: I0218 11:56:54.716535 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.018013 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-st9pz" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.136735 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmckq\" (UniqueName: \"kubernetes.io/projected/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-kube-api-access-zmckq\") pod \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.137133 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-config-data\") pod \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.137250 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-combined-ca-bundle\") pod \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.137350 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-db-sync-config-data\") pod \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\" (UID: \"855fb3ec-e473-4a99-a94f-cc96dda6d9c4\") " Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.164059 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.212513 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "855fb3ec-e473-4a99-a94f-cc96dda6d9c4" (UID: 
"855fb3ec-e473-4a99-a94f-cc96dda6d9c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:55 crc kubenswrapper[4922]: W0218 11:56:55.221650 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74a83ecb_de31_4767_a178_bccf8a37e93e.slice/crio-94eaad4dfdfd79b9d6e5320a8b3266331430b86c446a5d0f1d18cea2bf387427 WatchSource:0}: Error finding container 94eaad4dfdfd79b9d6e5320a8b3266331430b86c446a5d0f1d18cea2bf387427: Status 404 returned error can't find the container with id 94eaad4dfdfd79b9d6e5320a8b3266331430b86c446a5d0f1d18cea2bf387427 Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.227504 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-kube-api-access-zmckq" (OuterVolumeSpecName: "kube-api-access-zmckq") pod "855fb3ec-e473-4a99-a94f-cc96dda6d9c4" (UID: "855fb3ec-e473-4a99-a94f-cc96dda6d9c4"). InnerVolumeSpecName "kube-api-access-zmckq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.239418 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.239449 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmckq\" (UniqueName: \"kubernetes.io/projected/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-kube-api-access-zmckq\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.322072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "855fb3ec-e473-4a99-a94f-cc96dda6d9c4" (UID: "855fb3ec-e473-4a99-a94f-cc96dda6d9c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.344648 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.385349 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-config-data" (OuterVolumeSpecName: "config-data") pod "855fb3ec-e473-4a99-a94f-cc96dda6d9c4" (UID: "855fb3ec-e473-4a99-a94f-cc96dda6d9c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.446936 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855fb3ec-e473-4a99-a94f-cc96dda6d9c4-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.692836 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.748327 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerStarted","Data":"da256f656dc5f88aa583a65920c90cb68f469934b1ee4eea56bd44df48dcdf88"} Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.759788 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74a83ecb-de31-4767-a178-bccf8a37e93e","Type":"ContainerStarted","Data":"94eaad4dfdfd79b9d6e5320a8b3266331430b86c446a5d0f1d18cea2bf387427"} Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.776121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-st9pz" event={"ID":"855fb3ec-e473-4a99-a94f-cc96dda6d9c4","Type":"ContainerDied","Data":"bb03e387f02f6078ba9ca11f5028b069ffe62c115543a3d26dcd8e4428a02edd"} Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.776163 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb03e387f02f6078ba9ca11f5028b069ffe62c115543a3d26dcd8e4428a02edd" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.776226 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-st9pz" Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.796734 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"245b1cb9-d98f-4875-adf6-ab887f76849d","Type":"ContainerStarted","Data":"4e8f10c7492476510baebc3943dff72a9dc028154bb64b93d0af81c6c13b9994"} Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.905632 4922 generic.go:334] "Generic (PLEG): container finished" podID="d15f2ab3-202d-4241-a636-4d00475874aa" containerID="8439e696b2fd32672982849d8b37ce88e9906b06a73598f166488f6da611b3af" exitCode=0 Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.906802 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" event={"ID":"d15f2ab3-202d-4241-a636-4d00475874aa","Type":"ContainerDied","Data":"8439e696b2fd32672982849d8b37ce88e9906b06a73598f166488f6da611b3af"} Feb 18 11:56:55 crc kubenswrapper[4922]: I0218 11:56:55.958697 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc6887bc5-xfpt6"] Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.015581 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.176959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-config\") pod \"d15f2ab3-202d-4241-a636-4d00475874aa\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.177135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-swift-storage-0\") pod \"d15f2ab3-202d-4241-a636-4d00475874aa\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.177191 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-nb\") pod \"d15f2ab3-202d-4241-a636-4d00475874aa\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.177278 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-sb\") pod \"d15f2ab3-202d-4241-a636-4d00475874aa\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.177393 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-svc\") pod \"d15f2ab3-202d-4241-a636-4d00475874aa\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.177683 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8ptt\" 
(UniqueName: \"kubernetes.io/projected/d15f2ab3-202d-4241-a636-4d00475874aa-kube-api-access-b8ptt\") pod \"d15f2ab3-202d-4241-a636-4d00475874aa\" (UID: \"d15f2ab3-202d-4241-a636-4d00475874aa\") " Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.245166 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15f2ab3-202d-4241-a636-4d00475874aa-kube-api-access-b8ptt" (OuterVolumeSpecName: "kube-api-access-b8ptt") pod "d15f2ab3-202d-4241-a636-4d00475874aa" (UID: "d15f2ab3-202d-4241-a636-4d00475874aa"). InnerVolumeSpecName "kube-api-access-b8ptt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.284464 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8ptt\" (UniqueName: \"kubernetes.io/projected/d15f2ab3-202d-4241-a636-4d00475874aa-kube-api-access-b8ptt\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.327156 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d15f2ab3-202d-4241-a636-4d00475874aa" (UID: "d15f2ab3-202d-4241-a636-4d00475874aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.370102 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d15f2ab3-202d-4241-a636-4d00475874aa" (UID: "d15f2ab3-202d-4241-a636-4d00475874aa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.387437 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.388683 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.388715 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.389279 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d15f2ab3-202d-4241-a636-4d00475874aa" (UID: "d15f2ab3-202d-4241-a636-4d00475874aa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.394740 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-config" (OuterVolumeSpecName: "config") pod "d15f2ab3-202d-4241-a636-4d00475874aa" (UID: "d15f2ab3-202d-4241-a636-4d00475874aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.447631 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d15f2ab3-202d-4241-a636-4d00475874aa" (UID: "d15f2ab3-202d-4241-a636-4d00475874aa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.492929 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.493201 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.493292 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15f2ab3-202d-4241-a636-4d00475874aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.705697 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.741510 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.745760 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc6887bc5-xfpt6"] Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.815831 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"] Feb 18 11:56:56 crc kubenswrapper[4922]: E0218 11:56:56.816438 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" containerName="glance-db-sync" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.816459 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" containerName="glance-db-sync" Feb 18 11:56:56 crc kubenswrapper[4922]: E0218 11:56:56.816484 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" containerName="init" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.816492 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" containerName="init" Feb 18 11:56:56 crc kubenswrapper[4922]: E0218 11:56:56.816507 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" containerName="dnsmasq-dns" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.816515 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" containerName="dnsmasq-dns" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.821647 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" containerName="dnsmasq-dns" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.821719 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" containerName="glance-db-sync" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.823457 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.825838 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.835099 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.860478 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"] Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.917864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.917992 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-svc\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.918093 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.918133 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-config\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.918264 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2htl\" (UniqueName: \"kubernetes.io/projected/d61bee1b-0ee4-4c97-8d5d-8655406f124c-kube-api-access-k2htl\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.918414 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.956791 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" event={"ID":"c5a7cb59-a6a3-4653-a63a-5942277f6663","Type":"ContainerStarted","Data":"1944f842e23fb615957951883e249b9c3deafa98b8f8ab3f95ea984230f6d29c"} Feb 18 11:56:56 crc kubenswrapper[4922]: I0218 11:56:56.971063 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed1bc0f3-a613-4565-b1f5-e962556acb00","Type":"ContainerStarted","Data":"80ecfe632f1eb5d18cb8c8fa491a890fbee92c07e3b873e227a8dfcd9bb02d67"} Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.020013 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.032848 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.032988 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.033221 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-svc\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.033432 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.033501 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-config\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 
18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.033715 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2htl\" (UniqueName: \"kubernetes.io/projected/d61bee1b-0ee4-4c97-8d5d-8655406f124c-kube-api-access-k2htl\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.045384 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.046002 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.046609 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-svc\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.047189 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.054805 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-config\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.098128 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.112323 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb97b9f9-22mm8" event={"ID":"d15f2ab3-202d-4241-a636-4d00475874aa","Type":"ContainerDied","Data":"b70a4c113ae5df77985adf93070afceb0e7e0972f2d420cb43d3ae8ed3526536"} Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.112445 4922 scope.go:117] "RemoveContainer" containerID="8439e696b2fd32672982849d8b37ce88e9906b06a73598f166488f6da611b3af" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.140983 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.196346 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.216181 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2htl\" (UniqueName: \"kubernetes.io/projected/d61bee1b-0ee4-4c97-8d5d-8655406f124c-kube-api-access-k2htl\") pod \"dnsmasq-dns-5cc8b5d5c5-jrg5v\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.249668 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdb97b9f9-22mm8"] Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.250665 4922 scope.go:117] "RemoveContainer" 
containerID="007affd267df0eaaea70525abb6cf65b3773291056b908124aaf8cd367384660" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.272133 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.272801 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bdb97b9f9-22mm8"] Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.451425 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.453421 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.460107 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.460417 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jr8f4" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.460547 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.463515 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.610068 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.610652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.610718 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-logs\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.610818 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.610854 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.610908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhgh\" (UniqueName: \"kubernetes.io/projected/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-kube-api-access-zbhgh\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.611056 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.713951 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714003 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714032 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbhgh\" (UniqueName: \"kubernetes.io/projected/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-kube-api-access-zbhgh\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714103 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714152 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714178 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714212 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-logs\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.714838 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-logs\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.721548 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.726610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.727302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.731494 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.733631 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.760703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbhgh\" (UniqueName: \"kubernetes.io/projected/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-kube-api-access-zbhgh\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.775599 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " 
pod="openstack/glance-default-external-api-0" Feb 18 11:56:57 crc kubenswrapper[4922]: I0218 11:56:57.886022 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.004101 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.005991 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.013144 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.047447 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.066774 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"] Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.082732 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerStarted","Data":"84badbb54b04871d4102bc412504ae7f5911b23870f7e7ded6df793d7986d02b"} Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.083162 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.094770 4922 generic.go:334] "Generic (PLEG): container finished" podID="c5a7cb59-a6a3-4653-a63a-5942277f6663" containerID="828b6208153495b0d3b9a186701d25e980bb083d35eb836c7903452c2dfbbadf" exitCode=0 Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.095121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" 
event={"ID":"c5a7cb59-a6a3-4653-a63a-5942277f6663","Type":"ContainerDied","Data":"828b6208153495b0d3b9a186701d25e980bb083d35eb836c7903452c2dfbbadf"} Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.111494 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.703494765 podStartE2EDuration="10.111470397s" podCreationTimestamp="2026-02-18 11:56:48 +0000 UTC" firstStartedPulling="2026-02-18 11:56:50.164631081 +0000 UTC m=+1211.892335161" lastFinishedPulling="2026-02-18 11:56:57.572606713 +0000 UTC m=+1219.300310793" observedRunningTime="2026-02-18 11:56:58.109152338 +0000 UTC m=+1219.836856428" watchObservedRunningTime="2026-02-18 11:56:58.111470397 +0000 UTC m=+1219.839174477" Feb 18 11:56:58 crc kubenswrapper[4922]: W0218 11:56:58.113083 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd61bee1b_0ee4_4c97_8d5d_8655406f124c.slice/crio-18a7a7ab6ce87177724ccea1ae9a737d84614c82a96458fedc35807aee0f2e52 WatchSource:0}: Error finding container 18a7a7ab6ce87177724ccea1ae9a737d84614c82a96458fedc35807aee0f2e52: Status 404 returned error can't find the container with id 18a7a7ab6ce87177724ccea1ae9a737d84614c82a96458fedc35807aee0f2e52 Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.122879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed1bc0f3-a613-4565-b1f5-e962556acb00","Type":"ContainerStarted","Data":"1c2064f4a7cb5c8c928c435089f458b706b8fa933a0140eb0fd478bbb13155df"} Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124100 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0" 
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124147 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124214 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trdb\" (UniqueName: \"kubernetes.io/projected/16d2c710-0adb-4543-8e7c-7e318d2e0091-kube-api-access-2trdb\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124271 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-logs\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.124320 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.225690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.226751 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.226777 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.226798 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.226853 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.226915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2trdb\" (UniqueName: \"kubernetes.io/projected/16d2c710-0adb-4543-8e7c-7e318d2e0091-kube-api-access-2trdb\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.226946 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-logs\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.229206 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-logs\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.231712 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.232322 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.236574 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.236715 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.266179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.275254 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2trdb\" (UniqueName: \"kubernetes.io/projected/16d2c710-0adb-4543-8e7c-7e318d2e0091-kube-api-access-2trdb\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.359834 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.364214 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.389874 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused"
Feb 18 11:56:58 crc kubenswrapper[4922]: I0218 11:56:58.870961 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 18 11:56:58 crc kubenswrapper[4922]: W0218 11:56:58.880959 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ad78d97_6a95_4bd9_9204_6f8d0af71cf3.slice/crio-f239b779f8f04023b896a330788159d833d8acf6105f2514edc0ee6188559dff WatchSource:0}: Error finding container f239b779f8f04023b896a330788159d833d8acf6105f2514edc0ee6188559dff: Status 404 returned error can't find the container with id f239b779f8f04023b896a330788159d833d8acf6105f2514edc0ee6188559dff
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.084427 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15f2ab3-202d-4241-a636-4d00475874aa" path="/var/lib/kubelet/pods/d15f2ab3-202d-4241-a636-4d00475874aa/volumes"
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.116315 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6"
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.158079 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-swift-storage-0\") pod \"c5a7cb59-a6a3-4653-a63a-5942277f6663\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") "
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.160196 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-config\") pod \"c5a7cb59-a6a3-4653-a63a-5942277f6663\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") "
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.161011 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgkxw\" (UniqueName: \"kubernetes.io/projected/c5a7cb59-a6a3-4653-a63a-5942277f6663-kube-api-access-rgkxw\") pod \"c5a7cb59-a6a3-4653-a63a-5942277f6663\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") "
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.161057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-svc\") pod \"c5a7cb59-a6a3-4653-a63a-5942277f6663\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") "
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.161100 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-nb\") pod \"c5a7cb59-a6a3-4653-a63a-5942277f6663\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") "
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.161132 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-sb\") pod \"c5a7cb59-a6a3-4653-a63a-5942277f6663\" (UID: \"c5a7cb59-a6a3-4653-a63a-5942277f6663\") "
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.190076 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" event={"ID":"d61bee1b-0ee4-4c97-8d5d-8655406f124c","Type":"ContainerStarted","Data":"ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61"}
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.190142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" event={"ID":"d61bee1b-0ee4-4c97-8d5d-8655406f124c","Type":"ContainerStarted","Data":"18a7a7ab6ce87177724ccea1ae9a737d84614c82a96458fedc35807aee0f2e52"}
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.196743 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a7cb59-a6a3-4653-a63a-5942277f6663-kube-api-access-rgkxw" (OuterVolumeSpecName: "kube-api-access-rgkxw") pod "c5a7cb59-a6a3-4653-a63a-5942277f6663" (UID: "c5a7cb59-a6a3-4653-a63a-5942277f6663"). InnerVolumeSpecName "kube-api-access-rgkxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.197912 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74a83ecb-de31-4767-a178-bccf8a37e93e","Type":"ContainerStarted","Data":"d5a387d1fd841c5e7a8b16f873f082fef6f77bf7e19af0404c6f60662142d1da"}
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.202244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5a7cb59-a6a3-4653-a63a-5942277f6663" (UID: "c5a7cb59-a6a3-4653-a63a-5942277f6663"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.208064 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5a7cb59-a6a3-4653-a63a-5942277f6663" (UID: "c5a7cb59-a6a3-4653-a63a-5942277f6663"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.209057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6" event={"ID":"c5a7cb59-a6a3-4653-a63a-5942277f6663","Type":"ContainerDied","Data":"1944f842e23fb615957951883e249b9c3deafa98b8f8ab3f95ea984230f6d29c"}
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.209100 4922 scope.go:117] "RemoveContainer" containerID="828b6208153495b0d3b9a186701d25e980bb083d35eb836c7903452c2dfbbadf"
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.209219 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc6887bc5-xfpt6"
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.226725 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-config" (OuterVolumeSpecName: "config") pod "c5a7cb59-a6a3-4653-a63a-5942277f6663" (UID: "c5a7cb59-a6a3-4653-a63a-5942277f6663"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.226811 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3","Type":"ContainerStarted","Data":"f239b779f8f04023b896a330788159d833d8acf6105f2514edc0ee6188559dff"}
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.236184 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5a7cb59-a6a3-4653-a63a-5942277f6663" (UID: "c5a7cb59-a6a3-4653-a63a-5942277f6663"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.248835 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5a7cb59-a6a3-4653-a63a-5942277f6663" (UID: "c5a7cb59-a6a3-4653-a63a-5942277f6663"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.265143 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.265177 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.265187 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.265197 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgkxw\" (UniqueName: \"kubernetes.io/projected/c5a7cb59-a6a3-4653-a63a-5942277f6663-kube-api-access-rgkxw\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.265208 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.265219 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5a7cb59-a6a3-4653-a63a-5942277f6663-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.321453 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.362823 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.405622 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.718463 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc6887bc5-xfpt6"]
Feb 18 11:56:59 crc kubenswrapper[4922]: I0218 11:56:59.719637 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dc6887bc5-xfpt6"]
Feb 18 11:57:00 crc kubenswrapper[4922]: I0218 11:57:00.242864 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d2c710-0adb-4543-8e7c-7e318d2e0091","Type":"ContainerStarted","Data":"a2ee0d21eb104489d672ffe18ba532d76b875730a7c6807acc779f5e1a1423e5"}
Feb 18 11:57:00 crc kubenswrapper[4922]: I0218 11:57:00.245561 4922 generic.go:334] "Generic (PLEG): container finished" podID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerID="ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61" exitCode=0
Feb 18 11:57:00 crc kubenswrapper[4922]: I0218 11:57:00.245617 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" event={"ID":"d61bee1b-0ee4-4c97-8d5d-8655406f124c","Type":"ContainerDied","Data":"ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61"}
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.003983 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a7cb59-a6a3-4653-a63a-5942277f6663" path="/var/lib/kubelet/pods/c5a7cb59-a6a3-4653-a63a-5942277f6663/volumes"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.295116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74a83ecb-de31-4767-a178-bccf8a37e93e","Type":"ContainerStarted","Data":"fc4b0d98b4a0ffc17082b832ce25867774e01fb1582211f0e74f81571c524a12"}
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.300887 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d2c710-0adb-4543-8e7c-7e318d2e0091","Type":"ContainerStarted","Data":"5384cbf5a529a7ac8c08ebb121d6b3f3cb726547d83ae663fede88079514676b"}
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.306517 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3","Type":"ContainerStarted","Data":"b4ae6c8859c170ed82fdd4029620545643f83a046da270a4665b5e08aebee3d6"}
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.308468 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed1bc0f3-a613-4565-b1f5-e962556acb00","Type":"ContainerStarted","Data":"08eea4a1ca654ec0f46724358fe53403626c28895fe7bd0802a7388b3e60a117"}
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.308659 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api-log" containerID="cri-o://1c2064f4a7cb5c8c928c435089f458b706b8fa933a0140eb0fd478bbb13155df" gracePeriod=30
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.308968 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.309007 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api" containerID="cri-o://08eea4a1ca654ec0f46724358fe53403626c28895fe7bd0802a7388b3e60a117" gracePeriod=30
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.312217 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" event={"ID":"d61bee1b-0ee4-4c97-8d5d-8655406f124c","Type":"ContainerStarted","Data":"6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674"}
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.312973 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.324992 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.159958047 podStartE2EDuration="8.324976322s" podCreationTimestamp="2026-02-18 11:56:53 +0000 UTC" firstStartedPulling="2026-02-18 11:56:55.230696774 +0000 UTC m=+1216.958400854" lastFinishedPulling="2026-02-18 11:56:56.395715059 +0000 UTC m=+1218.123419129" observedRunningTime="2026-02-18 11:57:01.320963561 +0000 UTC m=+1223.048667641" watchObservedRunningTime="2026-02-18 11:57:01.324976322 +0000 UTC m=+1223.052680402"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.359409 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.359390012 podStartE2EDuration="8.359390012s" podCreationTimestamp="2026-02-18 11:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:01.34584223 +0000 UTC m=+1223.073546310" watchObservedRunningTime="2026-02-18 11:57:01.359390012 +0000 UTC m=+1223.087094092"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.383118 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" podStartSLOduration=5.383098512 podStartE2EDuration="5.383098512s" podCreationTimestamp="2026-02-18 11:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:01.380808444 +0000 UTC m=+1223.108512524" watchObservedRunningTime="2026-02-18 11:57:01.383098512 +0000 UTC m=+1223.110802592"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.483168 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.513701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:57:01 crc kubenswrapper[4922]: I0218 11:57:01.560187 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.035134 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d677498bd-cxq98"
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.363110 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3","Type":"ContainerStarted","Data":"d3c510b21747ecd1b7af150cea8f55ebe831e769ad7c9a3529624a743396fb61"}
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.363283 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-log" containerID="cri-o://b4ae6c8859c170ed82fdd4029620545643f83a046da270a4665b5e08aebee3d6" gracePeriod=30
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.363496 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-httpd" containerID="cri-o://d3c510b21747ecd1b7af150cea8f55ebe831e769ad7c9a3529624a743396fb61" gracePeriod=30
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.380781 4922 generic.go:334] "Generic (PLEG): container finished" podID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerID="08eea4a1ca654ec0f46724358fe53403626c28895fe7bd0802a7388b3e60a117" exitCode=0
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.380806 4922 generic.go:334] "Generic (PLEG): container finished" podID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerID="1c2064f4a7cb5c8c928c435089f458b706b8fa933a0140eb0fd478bbb13155df" exitCode=143
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.380849 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed1bc0f3-a613-4565-b1f5-e962556acb00","Type":"ContainerDied","Data":"08eea4a1ca654ec0f46724358fe53403626c28895fe7bd0802a7388b3e60a117"}
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.380874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed1bc0f3-a613-4565-b1f5-e962556acb00","Type":"ContainerDied","Data":"1c2064f4a7cb5c8c928c435089f458b706b8fa933a0140eb0fd478bbb13155df"}
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.380883 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed1bc0f3-a613-4565-b1f5-e962556acb00","Type":"ContainerDied","Data":"80ecfe632f1eb5d18cb8c8fa491a890fbee92c07e3b873e227a8dfcd9bb02d67"}
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.380895 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ecfe632f1eb5d18cb8c8fa491a890fbee92c07e3b873e227a8dfcd9bb02d67"
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.384640 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-log" containerID="cri-o://5384cbf5a529a7ac8c08ebb121d6b3f3cb726547d83ae663fede88079514676b" gracePeriod=30
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.384904 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d2c710-0adb-4543-8e7c-7e318d2e0091","Type":"ContainerStarted","Data":"f6099997ca87e1e2e208c851a429f437033093112ae306d8aeb490803c1d2917"}
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.385145 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-httpd" containerID="cri-o://f6099997ca87e1e2e208c851a429f437033093112ae306d8aeb490803c1d2917" gracePeriod=30
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.393854 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.393829986 podStartE2EDuration="6.393829986s" podCreationTimestamp="2026-02-18 11:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:02.385390423 +0000 UTC m=+1224.113094503" watchObservedRunningTime="2026-02-18 11:57:02.393829986 +0000 UTC m=+1224.121534066"
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.435937 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.460843 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.46082124 podStartE2EDuration="6.46082124s" podCreationTimestamp="2026-02-18 11:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:02.427345863 +0000 UTC m=+1224.155049953" watchObservedRunningTime="2026-02-18 11:57:02.46082124 +0000 UTC m=+1224.188525320"
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-combined-ca-bundle\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616444 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tqmb\" (UniqueName: \"kubernetes.io/projected/ed1bc0f3-a613-4565-b1f5-e962556acb00-kube-api-access-2tqmb\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616475 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed1bc0f3-a613-4565-b1f5-e962556acb00-etc-machine-id\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616561 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1bc0f3-a613-4565-b1f5-e962556acb00-logs\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616641 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data-custom\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616674 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-scripts\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.616784 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data\") pod \"ed1bc0f3-a613-4565-b1f5-e962556acb00\" (UID: \"ed1bc0f3-a613-4565-b1f5-e962556acb00\") "
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.618866 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed1bc0f3-a613-4565-b1f5-e962556acb00-logs" (OuterVolumeSpecName: "logs") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.621708 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed1bc0f3-a613-4565-b1f5-e962556acb00-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.637772 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1bc0f3-a613-4565-b1f5-e962556acb00-kube-api-access-2tqmb" (OuterVolumeSpecName: "kube-api-access-2tqmb") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "kube-api-access-2tqmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.638428 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-scripts" (OuterVolumeSpecName: "scripts") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.646540 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.729564 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.730420 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed1bc0f3-a613-4565-b1f5-e962556acb00-logs\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.730496 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.730511 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.730525 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.730538 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tqmb\" (UniqueName: \"kubernetes.io/projected/ed1bc0f3-a613-4565-b1f5-e962556acb00-kube-api-access-2tqmb\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.730551 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed1bc0f3-a613-4565-b1f5-e962556acb00-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.796634 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data" (OuterVolumeSpecName: "config-data") pod "ed1bc0f3-a613-4565-b1f5-e962556acb00" (UID: "ed1bc0f3-a613-4565-b1f5-e962556acb00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:02 crc kubenswrapper[4922]: I0218 11:57:02.835716 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1bc0f3-a613-4565-b1f5-e962556acb00-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.462248 4922 generic.go:334] "Generic (PLEG): container finished" podID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerID="f6099997ca87e1e2e208c851a429f437033093112ae306d8aeb490803c1d2917" exitCode=0
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.462637 4922 generic.go:334] "Generic (PLEG): container finished" podID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerID="5384cbf5a529a7ac8c08ebb121d6b3f3cb726547d83ae663fede88079514676b" exitCode=143
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.462496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d2c710-0adb-4543-8e7c-7e318d2e0091","Type":"ContainerDied","Data":"f6099997ca87e1e2e208c851a429f437033093112ae306d8aeb490803c1d2917"}
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.462750 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d2c710-0adb-4543-8e7c-7e318d2e0091","Type":"ContainerDied","Data":"5384cbf5a529a7ac8c08ebb121d6b3f3cb726547d83ae663fede88079514676b"}
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.478984 4922 generic.go:334] "Generic (PLEG): container finished" podID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerID="b4ae6c8859c170ed82fdd4029620545643f83a046da270a4665b5e08aebee3d6" exitCode=143
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.479139 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.480338 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3","Type":"ContainerDied","Data":"b4ae6c8859c170ed82fdd4029620545643f83a046da270a4665b5e08aebee3d6"}
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.519221 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.554606 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.602448 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 18 11:57:03 crc kubenswrapper[4922]: E0218 11:57:03.603093 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api-log"
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.603113 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api-log"
Feb 18 11:57:03 crc kubenswrapper[4922]: E0218 11:57:03.603139 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a7cb59-a6a3-4653-a63a-5942277f6663" containerName="init"
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.603162 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a7cb59-a6a3-4653-a63a-5942277f6663" containerName="init"
Feb 18 11:57:03 crc kubenswrapper[4922]: E0218 11:57:03.603201 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api"
Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.603211 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api"
Feb 18 11:57:03 crc
kubenswrapper[4922]: I0218 11:57:03.603511 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a7cb59-a6a3-4653-a63a-5942277f6663" containerName="init" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.603539 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api-log" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.603549 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" containerName="cinder-api" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.605039 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.615777 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.616075 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.638099 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.718461 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766589 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-config-data\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766672 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4gqn\" (UniqueName: 
\"kubernetes.io/projected/b897159b-9178-4f59-b254-08229460867d-kube-api-access-v4gqn\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766713 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b897159b-9178-4f59-b254-08229460867d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-scripts\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766790 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b897159b-9178-4f59-b254-08229460867d-logs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766857 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766882 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.766898 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.868799 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.869801 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b897159b-9178-4f59-b254-08229460867d-logs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.872314 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-config-data-custom\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.870656 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b897159b-9178-4f59-b254-08229460867d-logs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.873124 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.873561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.874047 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-config-data\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.874172 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4gqn\" (UniqueName: \"kubernetes.io/projected/b897159b-9178-4f59-b254-08229460867d-kube-api-access-v4gqn\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.874299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b897159b-9178-4f59-b254-08229460867d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " 
pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.874403 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-scripts\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.874883 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b897159b-9178-4f59-b254-08229460867d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.878402 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-config-data-custom\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.879037 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.890974 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-scripts\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.891339 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.892280 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-config-data\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.907003 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4gqn\" (UniqueName: \"kubernetes.io/projected/b897159b-9178-4f59-b254-08229460867d-kube-api-access-v4gqn\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:03 crc kubenswrapper[4922]: I0218 11:57:03.916299 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b897159b-9178-4f59-b254-08229460867d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b897159b-9178-4f59-b254-08229460867d\") " pod="openstack/cinder-api-0" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.028917 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.140634 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6bb9876df9-jt7kg"] Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.142674 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.146924 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.147356 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.147607 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.153424 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6bb9876df9-jt7kg"] Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.266730 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.278122 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.175:8080/\": dial tcp 10.217.0.175:8080: connect: connection refused" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296189 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tp5k\" (UniqueName: \"kubernetes.io/projected/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-kube-api-access-5tp5k\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296762 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-etc-swift\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: 
\"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296794 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-log-httpd\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296820 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-internal-tls-certs\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296843 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-config-data\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-combined-ca-bundle\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.296989 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-public-tls-certs\") pod 
\"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.297037 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-run-httpd\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.347636 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.398806 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-combined-ca-bundle\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.398897 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-public-tls-certs\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.398952 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-run-httpd\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.398980 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5tp5k\" (UniqueName: \"kubernetes.io/projected/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-kube-api-access-5tp5k\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.399022 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-etc-swift\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.399041 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-log-httpd\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.399062 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-internal-tls-certs\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.399077 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-config-data\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.401669 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-run-httpd\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.405060 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-log-httpd\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.414741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-combined-ca-bundle\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.417899 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-public-tls-certs\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.419881 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-config-data\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.420561 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tp5k\" (UniqueName: \"kubernetes.io/projected/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-kube-api-access-5tp5k\") pod 
\"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.422835 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-internal-tls-certs\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.424519 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cc5cf6d-c722-42a3-8389-b991e77d1bbf-etc-swift\") pod \"swift-proxy-6bb9876df9-jt7kg\" (UID: \"8cc5cf6d-c722-42a3-8389-b991e77d1bbf\") " pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.504083 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.505245 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-scripts\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.505400 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2trdb\" (UniqueName: \"kubernetes.io/projected/16d2c710-0adb-4543-8e7c-7e318d2e0091-kube-api-access-2trdb\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.505543 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-config-data\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.505775 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-logs\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.505925 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-httpd-run\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.506047 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-combined-ca-bundle\") pod \"16d2c710-0adb-4543-8e7c-7e318d2e0091\" (UID: \"16d2c710-0adb-4543-8e7c-7e318d2e0091\") " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.515773 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d2c710-0adb-4543-8e7c-7e318d2e0091-kube-api-access-2trdb" (OuterVolumeSpecName: "kube-api-access-2trdb") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "kube-api-access-2trdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.528616 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.540564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d2c710-0adb-4543-8e7c-7e318d2e0091","Type":"ContainerDied","Data":"a2ee0d21eb104489d672ffe18ba532d76b875730a7c6807acc779f5e1a1423e5"} Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.540625 4922 scope.go:117] "RemoveContainer" containerID="f6099997ca87e1e2e208c851a429f437033093112ae306d8aeb490803c1d2917" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.540784 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.564535 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-scripts" (OuterVolumeSpecName: "scripts") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.564697 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-logs" (OuterVolumeSpecName: "logs") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.564724 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.574928 4922 generic.go:334] "Generic (PLEG): container finished" podID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerID="d3c510b21747ecd1b7af150cea8f55ebe831e769ad7c9a3529624a743396fb61" exitCode=0 Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.574977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3","Type":"ContainerDied","Data":"d3c510b21747ecd1b7af150cea8f55ebe831e769ad7c9a3529624a743396fb61"} Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.590705 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.613653 4922 scope.go:117] "RemoveContainer" containerID="5384cbf5a529a7ac8c08ebb121d6b3f3cb726547d83ae663fede88079514676b" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.615040 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.615074 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.615084 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2trdb\" (UniqueName: \"kubernetes.io/projected/16d2c710-0adb-4543-8e7c-7e318d2e0091-kube-api-access-2trdb\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.615096 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.615105 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d2c710-0adb-4543-8e7c-7e318d2e0091-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.615112 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.643716 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-config-data" (OuterVolumeSpecName: "config-data") pod "16d2c710-0adb-4543-8e7c-7e318d2e0091" (UID: "16d2c710-0adb-4543-8e7c-7e318d2e0091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.654218 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.667851 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.717452 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.717507 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d2c710-0adb-4543-8e7c-7e318d2e0091-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.743089 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 11:57:04 crc kubenswrapper[4922]: W0218 11:57:04.760566 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb897159b_9178_4f59_b254_08229460867d.slice/crio-66ec14c893f5abca1bc83a4e927703007a05fa08b70365f0103fc74dc81ffc65 WatchSource:0}: Error finding container 66ec14c893f5abca1bc83a4e927703007a05fa08b70365f0103fc74dc81ffc65: Status 404 returned error can't find the container with id 66ec14c893f5abca1bc83a4e927703007a05fa08b70365f0103fc74dc81ffc65 Feb 18 11:57:04 crc kubenswrapper[4922]: I0218 11:57:04.850728 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.030546 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1bc0f3-a613-4565-b1f5-e962556acb00" path="/var/lib/kubelet/pods/ed1bc0f3-a613-4565-b1f5-e962556acb00/volumes" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.031934 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.039484 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.078217 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:05 crc kubenswrapper[4922]: E0218 11:57:05.078785 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-log" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.078804 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-log" Feb 18 11:57:05 crc kubenswrapper[4922]: E0218 11:57:05.078823 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-httpd" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.079911 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-httpd" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.081078 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-httpd" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.081121 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" containerName="glance-log" 
Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.083648 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.085861 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.086216 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.095061 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.142956 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-794d859fd8-fbbnx" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.238837 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d677498bd-cxq98"] Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.239177 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log" containerID="cri-o://95063da6319615da44d0b1f12dd9abb6a109354894bb0cb11f9c25d007446b4c" gracePeriod=30 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.239843 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api" containerID="cri-o://d307cc88ce59602f69621bcf0371da0ee28a8272ec22fc60147af239fd78badf" gracePeriod=30 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.250910 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": EOF" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.252717 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": EOF" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253311 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253458 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253489 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253527 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253714 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253755 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcrkn\" (UniqueName: \"kubernetes.io/projected/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-kube-api-access-qcrkn\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.253893 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.171:9311/healthcheck\": EOF" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.353509 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6bb9876df9-jt7kg"] Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.355968 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-logs\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356182 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356209 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356256 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcrkn\" (UniqueName: \"kubernetes.io/projected/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-kube-api-access-qcrkn\") pod 
\"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356339 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356412 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.356440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.358281 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.358728 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.360388 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.370590 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.372538 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.375238 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.389996 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc 
kubenswrapper[4922]: I0218 11:57:05.402023 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcrkn\" (UniqueName: \"kubernetes.io/projected/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-kube-api-access-qcrkn\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.428558 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.435263 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.639350 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.640719 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-central-agent" containerID="cri-o://ba51c32817af83f75090d7c0f3e6b11f618358a906fe3fa4abfa17b358890f4c" gracePeriod=30 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.641034 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="proxy-httpd" containerID="cri-o://84badbb54b04871d4102bc412504ae7f5911b23870f7e7ded6df793d7986d02b" gracePeriod=30 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.641307 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" 
containerName="ceilometer-notification-agent" containerID="cri-o://71670bed8cf5414a16d8e3139affef7f174a018a3fa1f96f685382e7b236b20a" gracePeriod=30 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.641424 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="sg-core" containerID="cri-o://da256f656dc5f88aa583a65920c90cb68f469934b1ee4eea56bd44df48dcdf88" gracePeriod=30 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.730305 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b897159b-9178-4f59-b254-08229460867d","Type":"ContainerStarted","Data":"66ec14c893f5abca1bc83a4e927703007a05fa08b70365f0103fc74dc81ffc65"} Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.736177 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.741344 4922 generic.go:334] "Generic (PLEG): container finished" podID="53371b07-a65f-4fec-8564-bcd51df6c010" containerID="95063da6319615da44d0b1f12dd9abb6a109354894bb0cb11f9c25d007446b4c" exitCode=143 Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.741446 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d677498bd-cxq98" event={"ID":"53371b07-a65f-4fec-8564-bcd51df6c010","Type":"ContainerDied","Data":"95063da6319615da44d0b1f12dd9abb6a109354894bb0cb11f9c25d007446b4c"} Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.753706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb9876df9-jt7kg" event={"ID":"8cc5cf6d-c722-42a3-8389-b991e77d1bbf","Type":"ContainerStarted","Data":"ef101f250be994a7814b871182058c04992ceb0141e2aa6a68e3a91fa188bb6b"} Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810214 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-combined-ca-bundle\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810272 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbhgh\" (UniqueName: \"kubernetes.io/projected/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-kube-api-access-zbhgh\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810329 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-scripts\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810488 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-httpd-run\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810576 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-logs\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810604 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.810628 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-config-data\") pod \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\" (UID: \"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3\") " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.813569 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-logs" (OuterVolumeSpecName: "logs") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.814508 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.820077 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-kube-api-access-zbhgh" (OuterVolumeSpecName: "kube-api-access-zbhgh") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "kube-api-access-zbhgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.828545 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-scripts" (OuterVolumeSpecName: "scripts") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.829865 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.867388 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.914827 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.915790 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.915816 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.915832 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbhgh\" (UniqueName: \"kubernetes.io/projected/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-kube-api-access-zbhgh\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:05 crc 
kubenswrapper[4922]: I0218 11:57:05.915844 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.915854 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:05 crc kubenswrapper[4922]: I0218 11:57:05.994279 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.015562 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-config-data" (OuterVolumeSpecName: "config-data") pod "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" (UID: "6ad78d97-6a95-4bd9-9204-6f8d0af71cf3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.018647 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.018688 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.213143 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.794041 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24eb828d-acb3-4b88-96dc-8d3bb8c49e86","Type":"ContainerStarted","Data":"205155ab7a59a38e41604dd6477d8795045c94879e29fefe3e7383b4bc42a275"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.811183 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b897159b-9178-4f59-b254-08229460867d","Type":"ContainerStarted","Data":"5b9dff47acb202a13f5efb12c57ae21c1a29a7d2f410f0ba676d77fbf9a6c0ef"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.826232 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ad78d97-6a95-4bd9-9204-6f8d0af71cf3","Type":"ContainerDied","Data":"f239b779f8f04023b896a330788159d833d8acf6105f2514edc0ee6188559dff"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.826258 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.826399 4922 scope.go:117] "RemoveContainer" containerID="d3c510b21747ecd1b7af150cea8f55ebe831e769ad7c9a3529624a743396fb61" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.833623 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb9876df9-jt7kg" event={"ID":"8cc5cf6d-c722-42a3-8389-b991e77d1bbf","Type":"ContainerStarted","Data":"66dc8164db89d83a5273d7b5c890920638da439ac802822d3c7ad93d40a2b796"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.833668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6bb9876df9-jt7kg" event={"ID":"8cc5cf6d-c722-42a3-8389-b991e77d1bbf","Type":"ContainerStarted","Data":"7c235e5e03c0add02f18ecc6178afdc6693e386cabfb339de7a43ee043e84f2a"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.834943 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.834978 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6bb9876df9-jt7kg" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850829 4922 generic.go:334] "Generic (PLEG): container finished" podID="31a682bb-b881-47f5-960b-c6ae54c24275" containerID="84badbb54b04871d4102bc412504ae7f5911b23870f7e7ded6df793d7986d02b" exitCode=0 Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850852 4922 generic.go:334] "Generic (PLEG): container finished" podID="31a682bb-b881-47f5-960b-c6ae54c24275" containerID="da256f656dc5f88aa583a65920c90cb68f469934b1ee4eea56bd44df48dcdf88" exitCode=2 Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850860 4922 generic.go:334] "Generic (PLEG): container finished" podID="31a682bb-b881-47f5-960b-c6ae54c24275" containerID="71670bed8cf5414a16d8e3139affef7f174a018a3fa1f96f685382e7b236b20a" 
exitCode=0 Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850867 4922 generic.go:334] "Generic (PLEG): container finished" podID="31a682bb-b881-47f5-960b-c6ae54c24275" containerID="ba51c32817af83f75090d7c0f3e6b11f618358a906fe3fa4abfa17b358890f4c" exitCode=0 Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850887 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerDied","Data":"84badbb54b04871d4102bc412504ae7f5911b23870f7e7ded6df793d7986d02b"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850911 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerDied","Data":"da256f656dc5f88aa583a65920c90cb68f469934b1ee4eea56bd44df48dcdf88"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850921 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerDied","Data":"71670bed8cf5414a16d8e3139affef7f174a018a3fa1f96f685382e7b236b20a"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.850929 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerDied","Data":"ba51c32817af83f75090d7c0f3e6b11f618358a906fe3fa4abfa17b358890f4c"} Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.863860 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6bb9876df9-jt7kg" podStartSLOduration=2.863842079 podStartE2EDuration="2.863842079s" podCreationTimestamp="2026-02-18 11:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:06.855043177 +0000 UTC m=+1228.582747257" watchObservedRunningTime="2026-02-18 11:57:06.863842079 +0000 
UTC m=+1228.591546159" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.866434 4922 scope.go:117] "RemoveContainer" containerID="b4ae6c8859c170ed82fdd4029620545643f83a046da270a4665b5e08aebee3d6" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.890302 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.911650 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.924923 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:06 crc kubenswrapper[4922]: E0218 11:57:06.925537 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-log" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.925564 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-log" Feb 18 11:57:06 crc kubenswrapper[4922]: E0218 11:57:06.925635 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-httpd" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.925647 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-httpd" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.925876 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-httpd" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.925917 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" containerName="glance-log" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.927193 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.937862 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.938182 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 11:57:06 crc kubenswrapper[4922]: I0218 11:57:06.945278 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.035803 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d2c710-0adb-4543-8e7c-7e318d2e0091" path="/var/lib/kubelet/pods/16d2c710-0adb-4543-8e7c-7e318d2e0091/volumes" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.045257 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad78d97-6a95-4bd9-9204-6f8d0af71cf3" path="/var/lib/kubelet/pods/6ad78d97-6a95-4bd9-9204-6f8d0af71cf3/volumes" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.055629 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-logs\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.055750 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-config-data\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.055806 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp7nr\" (UniqueName: \"kubernetes.io/projected/7810aaca-e072-467b-bba7-6a3e12310c68-kube-api-access-zp7nr\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.055834 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.055873 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.055904 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-scripts\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.056004 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.056038 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.176846 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-config-data\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.176968 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp7nr\" (UniqueName: \"kubernetes.io/projected/7810aaca-e072-467b-bba7-6a3e12310c68-kube-api-access-zp7nr\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.177005 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.177065 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.177100 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-scripts\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.177286 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.177358 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.177552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-logs\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.178091 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-logs\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.182285 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.184097 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-scripts\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.184504 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.185658 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-config-data\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.185737 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.207890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.214690 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp7nr\" (UniqueName: \"kubernetes.io/projected/7810aaca-e072-467b-bba7-6a3e12310c68-kube-api-access-zp7nr\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.239723 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.273520 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.379428 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-2ktqs"] Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.379700 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerName="dnsmasq-dns" containerID="cri-o://4fbf873b475eb4db471744da3b986195e954cf285ed89e854f16fe22b1d17ca4" gracePeriod=10 Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.437133 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.441553 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486081 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-sg-core-conf-yaml\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486168 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-log-httpd\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486192 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-scripts\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486226 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt6qk\" (UniqueName: \"kubernetes.io/projected/31a682bb-b881-47f5-960b-c6ae54c24275-kube-api-access-tt6qk\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486324 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-combined-ca-bundle\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486458 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-config-data\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.486565 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-run-httpd\") pod \"31a682bb-b881-47f5-960b-c6ae54c24275\" (UID: \"31a682bb-b881-47f5-960b-c6ae54c24275\") " Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.487457 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.491703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.499615 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-scripts" (OuterVolumeSpecName: "scripts") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.511695 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a682bb-b881-47f5-960b-c6ae54c24275-kube-api-access-tt6qk" (OuterVolumeSpecName: "kube-api-access-tt6qk") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "kube-api-access-tt6qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.586076 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.599247 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.599280 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.599289 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a682bb-b881-47f5-960b-c6ae54c24275-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.599297 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-scripts\") on node \"crc\" DevicePath \"\"" 
Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.599305 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt6qk\" (UniqueName: \"kubernetes.io/projected/31a682bb-b881-47f5-960b-c6ae54c24275-kube-api-access-tt6qk\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.789856 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.795564 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-config-data" (OuterVolumeSpecName: "config-data") pod "31a682bb-b881-47f5-960b-c6ae54c24275" (UID: "31a682bb-b881-47f5-960b-c6ae54c24275"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.806195 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.806237 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a682bb-b881-47f5-960b-c6ae54c24275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.884627 4922 generic.go:334] "Generic (PLEG): container finished" podID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerID="4fbf873b475eb4db471744da3b986195e954cf285ed89e854f16fe22b1d17ca4" exitCode=0 Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.885061 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" event={"ID":"24bbb94b-821e-4c8c-ae27-356f296903bf","Type":"ContainerDied","Data":"4fbf873b475eb4db471744da3b986195e954cf285ed89e854f16fe22b1d17ca4"} Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.898344 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a682bb-b881-47f5-960b-c6ae54c24275","Type":"ContainerDied","Data":"2f2fa694e60fe2de69033e6edac945c54110ee47bb40b70b10dec8b4f330dcd3"} Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.898429 4922 scope.go:117] "RemoveContainer" containerID="84badbb54b04871d4102bc412504ae7f5911b23870f7e7ded6df793d7986d02b" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.898665 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.909990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24eb828d-acb3-4b88-96dc-8d3bb8c49e86","Type":"ContainerStarted","Data":"2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a"} Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.920133 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b897159b-9178-4f59-b254-08229460867d","Type":"ContainerStarted","Data":"b1aff12a59b6181c85a4b4e0917df836f66c6b7e79213ee94f04444ae389d050"} Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.920712 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 18 11:57:07 crc kubenswrapper[4922]: I0218 11:57:07.971440 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.971414411 podStartE2EDuration="4.971414411s" podCreationTimestamp="2026-02-18 11:57:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:07.957730815 +0000 UTC m=+1229.685434895" watchObservedRunningTime="2026-02-18 11:57:07.971414411 +0000 UTC m=+1229.699118501" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.069874 4922 scope.go:117] "RemoveContainer" containerID="da256f656dc5f88aa583a65920c90cb68f469934b1ee4eea56bd44df48dcdf88" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.097199 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.113316 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.124549 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.187609 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:08 crc kubenswrapper[4922]: E0218 11:57:08.188183 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="sg-core" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188211 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="sg-core" Feb 18 11:57:08 crc kubenswrapper[4922]: E0218 11:57:08.188236 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerName="dnsmasq-dns" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188242 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerName="dnsmasq-dns" Feb 18 11:57:08 crc kubenswrapper[4922]: E0218 11:57:08.188254 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerName="init" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188260 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerName="init" Feb 18 11:57:08 crc kubenswrapper[4922]: E0218 11:57:08.188278 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-central-agent" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188285 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-central-agent" Feb 18 11:57:08 crc kubenswrapper[4922]: E0218 11:57:08.188310 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="proxy-httpd" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188316 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="proxy-httpd" Feb 18 11:57:08 crc kubenswrapper[4922]: E0218 11:57:08.188326 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-notification-agent" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188332 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-notification-agent" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188518 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="sg-core" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188533 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-notification-agent" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188544 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" containerName="dnsmasq-dns" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188553 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="proxy-httpd" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.188564 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" containerName="ceilometer-central-agent" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.193239 4922 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.204553 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.204901 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.208057 4922 scope.go:117] "RemoveContainer" containerID="71670bed8cf5414a16d8e3139affef7f174a018a3fa1f96f685382e7b236b20a" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.214843 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.223016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kkt\" (UniqueName: \"kubernetes.io/projected/24bbb94b-821e-4c8c-ae27-356f296903bf-kube-api-access-x8kkt\") pod \"24bbb94b-821e-4c8c-ae27-356f296903bf\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.223073 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-swift-storage-0\") pod \"24bbb94b-821e-4c8c-ae27-356f296903bf\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.223092 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-nb\") pod \"24bbb94b-821e-4c8c-ae27-356f296903bf\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.223545 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-sb\") pod \"24bbb94b-821e-4c8c-ae27-356f296903bf\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.223566 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-config\") pod \"24bbb94b-821e-4c8c-ae27-356f296903bf\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.223685 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-svc\") pod \"24bbb94b-821e-4c8c-ae27-356f296903bf\" (UID: \"24bbb94b-821e-4c8c-ae27-356f296903bf\") " Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.246345 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bbb94b-821e-4c8c-ae27-356f296903bf-kube-api-access-x8kkt" (OuterVolumeSpecName: "kube-api-access-x8kkt") pod "24bbb94b-821e-4c8c-ae27-356f296903bf" (UID: "24bbb94b-821e-4c8c-ae27-356f296903bf"). InnerVolumeSpecName "kube-api-access-x8kkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.320305 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24bbb94b-821e-4c8c-ae27-356f296903bf" (UID: "24bbb94b-821e-4c8c-ae27-356f296903bf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325277 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-log-httpd\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325378 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-scripts\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325444 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-config-data\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325483 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325503 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqwwp\" (UniqueName: \"kubernetes.io/projected/72a01906-824d-4581-8d88-7d40a91786a1-kube-api-access-sqwwp\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325549 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-run-httpd\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325653 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kkt\" (UniqueName: \"kubernetes.io/projected/24bbb94b-821e-4c8c-ae27-356f296903bf-kube-api-access-x8kkt\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.325670 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.346164 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.392916 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.393048 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9d79df67b-mg9kq" Feb 18 11:57:08 crc 
kubenswrapper[4922]: I0218 11:57:08.425423 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24bbb94b-821e-4c8c-ae27-356f296903bf" (UID: "24bbb94b-821e-4c8c-ae27-356f296903bf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427715 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-scripts\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-config-data\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427857 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427875 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqwwp\" (UniqueName: \"kubernetes.io/projected/72a01906-824d-4581-8d88-7d40a91786a1-kube-api-access-sqwwp\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427922 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-run-httpd\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.427997 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-log-httpd\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.428045 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.428580 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-log-httpd\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.430794 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-run-httpd\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.441616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-scripts\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.446501 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.473006 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-config-data\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.473198 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqwwp\" (UniqueName: \"kubernetes.io/projected/72a01906-824d-4581-8d88-7d40a91786a1-kube-api-access-sqwwp\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.479401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") " pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.497294 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24bbb94b-821e-4c8c-ae27-356f296903bf" (UID: "24bbb94b-821e-4c8c-ae27-356f296903bf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.531773 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.596089 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24bbb94b-821e-4c8c-ae27-356f296903bf" (UID: "24bbb94b-821e-4c8c-ae27-356f296903bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.622170 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.623640 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-config" (OuterVolumeSpecName: "config") pod "24bbb94b-821e-4c8c-ae27-356f296903bf" (UID: "24bbb94b-821e-4c8c-ae27-356f296903bf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.633320 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.633355 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24bbb94b-821e-4c8c-ae27-356f296903bf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.699639 4922 scope.go:117] "RemoveContainer" containerID="ba51c32817af83f75090d7c0f3e6b11f618358a906fe3fa4abfa17b358890f4c" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.947483 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.947490 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-2ktqs" event={"ID":"24bbb94b-821e-4c8c-ae27-356f296903bf","Type":"ContainerDied","Data":"d52ba93a2e395a5fe3e842c514b38b9be3486416e78d8b85381f805174f39d21"} Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.947958 4922 scope.go:117] "RemoveContainer" containerID="4fbf873b475eb4db471744da3b986195e954cf285ed89e854f16fe22b1d17ca4" Feb 18 11:57:08 crc kubenswrapper[4922]: I0218 11:57:08.962883 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24eb828d-acb3-4b88-96dc-8d3bb8c49e86","Type":"ContainerStarted","Data":"fae0743e5e0fb73717e68120a40713dc2bcdd3fa357699afed1bff0f5a1368e5"} Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.003483 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.003459664 podStartE2EDuration="4.003459664s" 
podCreationTimestamp="2026-02-18 11:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:08.993741418 +0000 UTC m=+1230.721445498" watchObservedRunningTime="2026-02-18 11:57:09.003459664 +0000 UTC m=+1230.731163744" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.010569 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a682bb-b881-47f5-960b-c6ae54c24275" path="/var/lib/kubelet/pods/31a682bb-b881-47f5-960b-c6ae54c24275/volumes" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.011494 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7810aaca-e072-467b-bba7-6a3e12310c68","Type":"ContainerStarted","Data":"9937aae78bceb48bd4f47887b4b7c1fa9f743a0bf2b9a03c23a054415125619f"} Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.025146 4922 scope.go:117] "RemoveContainer" containerID="7f1abe52f752c943d157e484e47d8d790a98ef3e4904432fd3c21618fac5c1e6" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.104694 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-2ktqs"] Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.126958 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-2ktqs"] Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.337591 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.704556 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": read tcp 10.217.0.2:49268->10.217.0.171:9311: read: connection reset by peer" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 
11:57:09.704632 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": read tcp 10.217.0.2:38238->10.217.0.171:9311: read: connection reset by peer" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.706669 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d677498bd-cxq98" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.171:9311/healthcheck\": dial tcp 10.217.0.171:9311: connect: connection refused" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.706814 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d677498bd-cxq98" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.808074 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.809070 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.893373 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 11:57:09 crc kubenswrapper[4922]: I0218 11:57:09.975928 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:57:10 crc 
kubenswrapper[4922]: I0218 11:57:10.001606 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7810aaca-e072-467b-bba7-6a3e12310c68","Type":"ContainerStarted","Data":"0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18"} Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.002663 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerStarted","Data":"e51f7f2faa63d11b52bb16edb526931063add3c924782fc45c0056ce678908a1"} Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.004407 4922 generic.go:334] "Generic (PLEG): container finished" podID="53371b07-a65f-4fec-8564-bcd51df6c010" containerID="d307cc88ce59602f69621bcf0371da0ee28a8272ec22fc60147af239fd78badf" exitCode=0 Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.004578 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="cinder-scheduler" containerID="cri-o://d5a387d1fd841c5e7a8b16f873f082fef6f77bf7e19af0404c6f60662142d1da" gracePeriod=30 Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.004799 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d677498bd-cxq98" event={"ID":"53371b07-a65f-4fec-8564-bcd51df6c010","Type":"ContainerDied","Data":"d307cc88ce59602f69621bcf0371da0ee28a8272ec22fc60147af239fd78badf"} Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.005856 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="probe" containerID="cri-o://fc4b0d98b4a0ffc17082b832ce25867774e01fb1582211f0e74f81571c524a12" gracePeriod=30 Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.330376 4922 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.343826 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d677498bd-cxq98" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.485049 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-combined-ca-bundle\") pod \"53371b07-a65f-4fec-8564-bcd51df6c010\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.485447 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53371b07-a65f-4fec-8564-bcd51df6c010-logs\") pod \"53371b07-a65f-4fec-8564-bcd51df6c010\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.485470 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data\") pod \"53371b07-a65f-4fec-8564-bcd51df6c010\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.485524 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data-custom\") pod \"53371b07-a65f-4fec-8564-bcd51df6c010\" (UID: \"53371b07-a65f-4fec-8564-bcd51df6c010\") " Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.485547 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdmzm\" (UniqueName: \"kubernetes.io/projected/53371b07-a65f-4fec-8564-bcd51df6c010-kube-api-access-jdmzm\") pod \"53371b07-a65f-4fec-8564-bcd51df6c010\" (UID: 
\"53371b07-a65f-4fec-8564-bcd51df6c010\") " Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.485914 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53371b07-a65f-4fec-8564-bcd51df6c010-logs" (OuterVolumeSpecName: "logs") pod "53371b07-a65f-4fec-8564-bcd51df6c010" (UID: "53371b07-a65f-4fec-8564-bcd51df6c010"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.486132 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53371b07-a65f-4fec-8564-bcd51df6c010-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.491648 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "53371b07-a65f-4fec-8564-bcd51df6c010" (UID: "53371b07-a65f-4fec-8564-bcd51df6c010"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.506835 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53371b07-a65f-4fec-8564-bcd51df6c010-kube-api-access-jdmzm" (OuterVolumeSpecName: "kube-api-access-jdmzm") pod "53371b07-a65f-4fec-8564-bcd51df6c010" (UID: "53371b07-a65f-4fec-8564-bcd51df6c010"). InnerVolumeSpecName "kube-api-access-jdmzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.530774 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53371b07-a65f-4fec-8564-bcd51df6c010" (UID: "53371b07-a65f-4fec-8564-bcd51df6c010"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.588546 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.588581 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdmzm\" (UniqueName: \"kubernetes.io/projected/53371b07-a65f-4fec-8564-bcd51df6c010-kube-api-access-jdmzm\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.588596 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.595659 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data" (OuterVolumeSpecName: "config-data") pod "53371b07-a65f-4fec-8564-bcd51df6c010" (UID: "53371b07-a65f-4fec-8564-bcd51df6c010"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.690501 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53371b07-a65f-4fec-8564-bcd51df6c010-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:10 crc kubenswrapper[4922]: I0218 11:57:10.988902 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bbb94b-821e-4c8c-ae27-356f296903bf" path="/var/lib/kubelet/pods/24bbb94b-821e-4c8c-ae27-356f296903bf/volumes" Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.017001 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d677498bd-cxq98" event={"ID":"53371b07-a65f-4fec-8564-bcd51df6c010","Type":"ContainerDied","Data":"6019911b846afdb67dccdb7e03471f7cbf11c9aa081bfb07a432273f6bc2c54b"} Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.017048 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d677498bd-cxq98" Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.017095 4922 scope.go:117] "RemoveContainer" containerID="d307cc88ce59602f69621bcf0371da0ee28a8272ec22fc60147af239fd78badf" Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.024468 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7810aaca-e072-467b-bba7-6a3e12310c68","Type":"ContainerStarted","Data":"be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4"} Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.028265 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerStarted","Data":"f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9"} Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.058348 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=5.058281606 podStartE2EDuration="5.058281606s" podCreationTimestamp="2026-02-18 11:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:11.048618072 +0000 UTC m=+1232.776322172" watchObservedRunningTime="2026-02-18 11:57:11.058281606 +0000 UTC m=+1232.785985686" Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.091977 4922 scope.go:117] "RemoveContainer" containerID="95063da6319615da44d0b1f12dd9abb6a109354894bb0cb11f9c25d007446b4c" Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.095998 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d677498bd-cxq98"] Feb 18 11:57:11 crc kubenswrapper[4922]: I0218 11:57:11.107417 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6d677498bd-cxq98"] Feb 18 11:57:12 crc kubenswrapper[4922]: I0218 11:57:12.039547 4922 generic.go:334] "Generic (PLEG): container finished" podID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerID="fc4b0d98b4a0ffc17082b832ce25867774e01fb1582211f0e74f81571c524a12" exitCode=0 Feb 18 11:57:12 crc kubenswrapper[4922]: I0218 11:57:12.039606 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74a83ecb-de31-4767-a178-bccf8a37e93e","Type":"ContainerDied","Data":"fc4b0d98b4a0ffc17082b832ce25867774e01fb1582211f0e74f81571c524a12"} Feb 18 11:57:12 crc kubenswrapper[4922]: I0218 11:57:12.043777 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerStarted","Data":"c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3"} Feb 18 11:57:12 crc kubenswrapper[4922]: I0218 11:57:12.990145 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" 
path="/var/lib/kubelet/pods/53371b07-a65f-4fec-8564-bcd51df6c010/volumes"
Feb 18 11:57:14 crc kubenswrapper[4922]: I0218 11:57:14.067945 4922 generic.go:334] "Generic (PLEG): container finished" podID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerID="666106d3ce79297e2d8ed502af3f7dcfa2bfde51946ace3e35cdb24c5fc323a9" exitCode=137
Feb 18 11:57:14 crc kubenswrapper[4922]: I0218 11:57:14.068042 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d79df67b-mg9kq" event={"ID":"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9","Type":"ContainerDied","Data":"666106d3ce79297e2d8ed502af3f7dcfa2bfde51946ace3e35cdb24c5fc323a9"}
Feb 18 11:57:14 crc kubenswrapper[4922]: I0218 11:57:14.070710 4922 generic.go:334] "Generic (PLEG): container finished" podID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerID="d5a387d1fd841c5e7a8b16f873f082fef6f77bf7e19af0404c6f60662142d1da" exitCode=0
Feb 18 11:57:14 crc kubenswrapper[4922]: I0218 11:57:14.070744 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74a83ecb-de31-4767-a178-bccf8a37e93e","Type":"ContainerDied","Data":"d5a387d1fd841c5e7a8b16f873f082fef6f77bf7e19af0404c6f60662142d1da"}
Feb 18 11:57:14 crc kubenswrapper[4922]: I0218 11:57:14.678473 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6bb9876df9-jt7kg"
Feb 18 11:57:14 crc kubenswrapper[4922]: I0218 11:57:14.682623 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6bb9876df9-jt7kg"
Feb 18 11:57:15 crc kubenswrapper[4922]: I0218 11:57:15.436218 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:15 crc kubenswrapper[4922]: I0218 11:57:15.436599 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:15 crc kubenswrapper[4922]: I0218 11:57:15.478510 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:15 crc kubenswrapper[4922]: I0218 11:57:15.483540 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:16 crc kubenswrapper[4922]: I0218 11:57:16.106622 4922 generic.go:334] "Generic (PLEG): container finished" podID="4bcd3608-244b-44f0-be1f-5d953cd35964" containerID="b2705f7646a829371102cfea131123135c7490beec56768c960d999d9c5fa2de" exitCode=0
Feb 18 11:57:16 crc kubenswrapper[4922]: I0218 11:57:16.108509 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zjb6x" event={"ID":"4bcd3608-244b-44f0-be1f-5d953cd35964","Type":"ContainerDied","Data":"b2705f7646a829371102cfea131123135c7490beec56768c960d999d9c5fa2de"}
Feb 18 11:57:16 crc kubenswrapper[4922]: I0218 11:57:16.108557 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:16 crc kubenswrapper[4922]: I0218 11:57:16.109321 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:16 crc kubenswrapper[4922]: I0218 11:57:16.559652 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 18 11:57:16 crc kubenswrapper[4922]: I0218 11:57:16.863819 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:17 crc kubenswrapper[4922]: I0218 11:57:17.438048 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:17 crc kubenswrapper[4922]: I0218 11:57:17.438452 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:17 crc kubenswrapper[4922]: I0218 11:57:17.505814 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:17 crc kubenswrapper[4922]: I0218 11:57:17.522279 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.130778 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.130807 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.133497 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.133552 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.289509 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7c7b84785b-f8lmj"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.389188 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9d79df67b-mg9kq" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.676433 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zjb6x"
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.869326 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvmgk\" (UniqueName: \"kubernetes.io/projected/4bcd3608-244b-44f0-be1f-5d953cd35964-kube-api-access-kvmgk\") pod \"4bcd3608-244b-44f0-be1f-5d953cd35964\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") "
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.869585 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-combined-ca-bundle\") pod \"4bcd3608-244b-44f0-be1f-5d953cd35964\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") "
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.869731 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-config\") pod \"4bcd3608-244b-44f0-be1f-5d953cd35964\" (UID: \"4bcd3608-244b-44f0-be1f-5d953cd35964\") "
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.878662 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bcd3608-244b-44f0-be1f-5d953cd35964-kube-api-access-kvmgk" (OuterVolumeSpecName: "kube-api-access-kvmgk") pod "4bcd3608-244b-44f0-be1f-5d953cd35964" (UID: "4bcd3608-244b-44f0-be1f-5d953cd35964"). InnerVolumeSpecName "kube-api-access-kvmgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.958405 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-config" (OuterVolumeSpecName: "config") pod "4bcd3608-244b-44f0-be1f-5d953cd35964" (UID: "4bcd3608-244b-44f0-be1f-5d953cd35964"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.964390 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bcd3608-244b-44f0-be1f-5d953cd35964" (UID: "4bcd3608-244b-44f0-be1f-5d953cd35964"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.982008 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.982051 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bcd3608-244b-44f0-be1f-5d953cd35964-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:18 crc kubenswrapper[4922]: I0218 11:57:18.982068 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvmgk\" (UniqueName: \"kubernetes.io/projected/4bcd3608-244b-44f0-be1f-5d953cd35964-kube-api-access-kvmgk\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.141757 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d79df67b-mg9kq"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.163536 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d79df67b-mg9kq" event={"ID":"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9","Type":"ContainerDied","Data":"daa7d2ea30da11dcfd9f4970f896f52151306629ac261d217fb3c7d7a4b261fd"}
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.163592 4922 scope.go:117] "RemoveContainer" containerID="5d696a6b456c46bcc03a5a4e42955c244ebb50a3f0ae3ca3ca3bec18dc6258ea"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.163782 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d79df67b-mg9kq"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.189946 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zjb6x" event={"ID":"4bcd3608-244b-44f0-be1f-5d953cd35964","Type":"ContainerDied","Data":"4f713717bbd69a1844002c6344555c40f26be59a2b8b6c3086945e62b2e3a5ca"}
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.189987 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f713717bbd69a1844002c6344555c40f26be59a2b8b6c3086945e62b2e3a5ca"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.191168 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zjb6x"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-logs\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-tls-certs\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288160 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-secret-key\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288185 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvkcr\" (UniqueName: \"kubernetes.io/projected/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-kube-api-access-kvkcr\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288254 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-scripts\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288313 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-combined-ca-bundle\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.288429 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-config-data\") pod \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\" (UID: \"e3b4cee1-5234-4b6c-93fa-3cb5687ecba9\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.289968 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-logs" (OuterVolumeSpecName: "logs") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.298547 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.302553 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-kube-api-access-kvkcr" (OuterVolumeSpecName: "kube-api-access-kvkcr") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "kube-api-access-kvkcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.329478 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7c7b84785b-f8lmj"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.342026 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-config-data" (OuterVolumeSpecName: "config-data") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.357429 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.387976 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-scripts" (OuterVolumeSpecName: "scripts") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.399442 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.399486 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-logs\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.399499 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.399513 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvkcr\" (UniqueName: \"kubernetes.io/projected/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-kube-api-access-kvkcr\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.399526 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.399538 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.411510 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" (UID: "e3b4cee1-5234-4b6c-93fa-3cb5687ecba9"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.494828 4922 scope.go:117] "RemoveContainer" containerID="666106d3ce79297e2d8ed502af3f7dcfa2bfde51946ace3e35cdb24c5fc323a9"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.501192 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.600281 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9d79df67b-mg9kq"]
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.619839 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.657338 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9d79df67b-mg9kq"]
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.813023 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-scripts\") pod \"74a83ecb-de31-4767-a178-bccf8a37e93e\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.813287 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r77f\" (UniqueName: \"kubernetes.io/projected/74a83ecb-de31-4767-a178-bccf8a37e93e-kube-api-access-4r77f\") pod \"74a83ecb-de31-4767-a178-bccf8a37e93e\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.813338 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data-custom\") pod \"74a83ecb-de31-4767-a178-bccf8a37e93e\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.813458 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-combined-ca-bundle\") pod \"74a83ecb-de31-4767-a178-bccf8a37e93e\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.813489 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a83ecb-de31-4767-a178-bccf8a37e93e-etc-machine-id\") pod \"74a83ecb-de31-4767-a178-bccf8a37e93e\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.813524 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data\") pod \"74a83ecb-de31-4767-a178-bccf8a37e93e\" (UID: \"74a83ecb-de31-4767-a178-bccf8a37e93e\") "
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.830466 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "74a83ecb-de31-4767-a178-bccf8a37e93e" (UID: "74a83ecb-de31-4767-a178-bccf8a37e93e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.830777 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74a83ecb-de31-4767-a178-bccf8a37e93e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "74a83ecb-de31-4767-a178-bccf8a37e93e" (UID: "74a83ecb-de31-4767-a178-bccf8a37e93e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.831152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-scripts" (OuterVolumeSpecName: "scripts") pod "74a83ecb-de31-4767-a178-bccf8a37e93e" (UID: "74a83ecb-de31-4767-a178-bccf8a37e93e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.842930 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a83ecb-de31-4767-a178-bccf8a37e93e-kube-api-access-4r77f" (OuterVolumeSpecName: "kube-api-access-4r77f") pod "74a83ecb-de31-4767-a178-bccf8a37e93e" (UID: "74a83ecb-de31-4767-a178-bccf8a37e93e"). InnerVolumeSpecName "kube-api-access-4r77f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.915447 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r77f\" (UniqueName: \"kubernetes.io/projected/74a83ecb-de31-4767-a178-bccf8a37e93e-kube-api-access-4r77f\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.915484 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.915496 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74a83ecb-de31-4767-a178-bccf8a37e93e-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:19 crc kubenswrapper[4922]: I0218 11:57:19.915508 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.003725 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74a83ecb-de31-4767-a178-bccf8a37e93e" (UID: "74a83ecb-de31-4767-a178-bccf8a37e93e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.019179 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.082437 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-78vwz"]
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.082920 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon-log"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.082934 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon-log"
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.082953 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.082959 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log"
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.082994 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="probe"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083001 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="probe"
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.083013 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083019 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api"
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.083031 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="cinder-scheduler"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083037 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="cinder-scheduler"
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.083048 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bcd3608-244b-44f0-be1f-5d953cd35964" containerName="neutron-db-sync"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083054 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bcd3608-244b-44f0-be1f-5d953cd35964" containerName="neutron-db-sync"
Feb 18 11:57:20 crc kubenswrapper[4922]: E0218 11:57:20.083061 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083067 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083246 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon-log"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083262 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api-log"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083272 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="53371b07-a65f-4fec-8564-bcd51df6c010" containerName="barbican-api"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083284 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bcd3608-244b-44f0-be1f-5d953cd35964" containerName="neutron-db-sync"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083296 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" containerName="horizon"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083305 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="cinder-scheduler"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.083313 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" containerName="probe"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.084356 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.137535 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-78vwz"]
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.222969 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-979b8465b-gmztk"]
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.226591 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.231440 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.231723 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bmh7l"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.232311 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.232471 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.234351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-config\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.234462 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mphx\" (UniqueName: \"kubernetes.io/projected/d36f2285-2752-4cad-bf52-fe6ae0b262d1-kube-api-access-4mphx\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.234535 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.234647 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.234677 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.234745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-svc\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.250963 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"245b1cb9-d98f-4875-adf6-ab887f76849d","Type":"ContainerStarted","Data":"3cb7398179b3d5a8982fa17deb6a397be93a34b5df70b46d6194d91e2f51206b"}
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.265733 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-979b8465b-gmztk"]
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.288350 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.288668 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.288564 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.288589 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74a83ecb-de31-4767-a178-bccf8a37e93e","Type":"ContainerDied","Data":"94eaad4dfdfd79b9d6e5320a8b3266331430b86c446a5d0f1d18cea2bf387427"}
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.289839 4922 scope.go:117] "RemoveContainer" containerID="fc4b0d98b4a0ffc17082b832ce25867774e01fb1582211f0e74f81571c524a12"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.297275 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data" (OuterVolumeSpecName: "config-data") pod "74a83ecb-de31-4767-a178-bccf8a37e93e" (UID: "74a83ecb-de31-4767-a178-bccf8a37e93e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.314769 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.193801735 podStartE2EDuration="27.314748392s" podCreationTimestamp="2026-02-18 11:56:53 +0000 UTC" firstStartedPulling="2026-02-18 11:56:54.73651391 +0000 UTC m=+1216.464217990" lastFinishedPulling="2026-02-18 11:57:18.857460567 +0000 UTC m=+1240.585164647" observedRunningTime="2026-02-18 11:57:20.281747338 +0000 UTC m=+1242.009451428" watchObservedRunningTime="2026-02-18 11:57:20.314748392 +0000 UTC m=+1242.042452472"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.332644 4922 scope.go:117] "RemoveContainer" containerID="d5a387d1fd841c5e7a8b16f873f082fef6f77bf7e19af0404c6f60662142d1da"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336125 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-combined-ca-bundle\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mphx\" (UniqueName: \"kubernetes.io/projected/d36f2285-2752-4cad-bf52-fe6ae0b262d1-kube-api-access-4mphx\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336218 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336259 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjthv\" (UniqueName: \"kubernetes.io/projected/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-kube-api-access-vjthv\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336294 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-config\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336451 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336532 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-svc\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-ovndb-tls-certs\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336612 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-config\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336651 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName:
\"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-httpd-config\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.336731 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a83ecb-de31-4767-a178-bccf8a37e93e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.337335 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.337731 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.338280 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.339509 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-svc\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc 
kubenswrapper[4922]: I0218 11:57:20.340145 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-config\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.364051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mphx\" (UniqueName: \"kubernetes.io/projected/d36f2285-2752-4cad-bf52-fe6ae0b262d1-kube-api-access-4mphx\") pod \"dnsmasq-dns-6578955fd5-78vwz\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.449040 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjthv\" (UniqueName: \"kubernetes.io/projected/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-kube-api-access-vjthv\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.449718 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-config\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.450307 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-ovndb-tls-certs\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.450666 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-httpd-config\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.450977 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-combined-ca-bundle\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.458173 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-ovndb-tls-certs\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.459616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-config\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.461204 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-combined-ca-bundle\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.475321 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.478068 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-httpd-config\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.491317 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjthv\" (UniqueName: \"kubernetes.io/projected/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-kube-api-access-vjthv\") pod \"neutron-979b8465b-gmztk\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.567716 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.802773 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.838412 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.868479 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.880158 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.899778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.929281 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.969025 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.969078 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd3cd2cf-8780-4de2-925c-5385d6398e49-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.969123 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc26d\" (UniqueName: \"kubernetes.io/projected/bd3cd2cf-8780-4de2-925c-5385d6398e49-kube-api-access-kc26d\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.969167 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-scripts\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 
11:57:20.969343 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-config-data\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:20 crc kubenswrapper[4922]: I0218 11:57:20.970238 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.012907 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a83ecb-de31-4767-a178-bccf8a37e93e" path="/var/lib/kubelet/pods/74a83ecb-de31-4767-a178-bccf8a37e93e/volumes" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.013851 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b4cee1-5234-4b6c-93fa-3cb5687ecba9" path="/var/lib/kubelet/pods/e3b4cee1-5234-4b6c-93fa-3cb5687ecba9/volumes" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.071456 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.073476 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc 
kubenswrapper[4922]: I0218 11:57:21.073503 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd3cd2cf-8780-4de2-925c-5385d6398e49-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.073541 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc26d\" (UniqueName: \"kubernetes.io/projected/bd3cd2cf-8780-4de2-925c-5385d6398e49-kube-api-access-kc26d\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.073591 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-scripts\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.073823 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-config-data\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.073982 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd3cd2cf-8780-4de2-925c-5385d6398e49-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.080616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.081077 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.083414 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-config-data\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.083850 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd3cd2cf-8780-4de2-925c-5385d6398e49-scripts\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.104081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc26d\" (UniqueName: \"kubernetes.io/projected/bd3cd2cf-8780-4de2-925c-5385d6398e49-kube-api-access-kc26d\") pod \"cinder-scheduler-0\" (UID: \"bd3cd2cf-8780-4de2-925c-5385d6398e49\") " pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.171135 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-78vwz"] Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.238052 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.375163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" event={"ID":"d36f2285-2752-4cad-bf52-fe6ae0b262d1","Type":"ContainerStarted","Data":"f6b2696bce7ccb6880bdda930a5ccfaf927c9ebc64dad81fb193a050fd9b8c85"} Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.381495 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerStarted","Data":"69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e"} Feb 18 11:57:21 crc kubenswrapper[4922]: I0218 11:57:21.468544 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-979b8465b-gmztk"] Feb 18 11:57:21 crc kubenswrapper[4922]: W0218 11:57:21.489699 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b1ea57e_dcf2_4e47_8650_af483b18ea8f.slice/crio-d41035e2a8cf606b335f710a1d0f91690ca9a161e3cc25d246206a6e3ab38420 WatchSource:0}: Error finding container d41035e2a8cf606b335f710a1d0f91690ca9a161e3cc25d246206a6e3ab38420: Status 404 returned error can't find the container with id d41035e2a8cf606b335f710a1d0f91690ca9a161e3cc25d246206a6e3ab38420 Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.028909 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 11:57:22 crc kubenswrapper[4922]: W0218 11:57:22.064727 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd3cd2cf_8780_4de2_925c_5385d6398e49.slice/crio-9f7f3452ee5fcb60297649d005b3ab7f8df412c6cf50a9719e163931e16c2987 WatchSource:0}: Error finding container 9f7f3452ee5fcb60297649d005b3ab7f8df412c6cf50a9719e163931e16c2987: Status 404 returned error can't find the container 
with id 9f7f3452ee5fcb60297649d005b3ab7f8df412c6cf50a9719e163931e16c2987 Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.394103 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bd3cd2cf-8780-4de2-925c-5385d6398e49","Type":"ContainerStarted","Data":"9f7f3452ee5fcb60297649d005b3ab7f8df412c6cf50a9719e163931e16c2987"} Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.397633 4922 generic.go:334] "Generic (PLEG): container finished" podID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerID="839de4434ebe21a5f0abbc718b56284e0f7743bf3463c809b6cae16fa7c2db5d" exitCode=0 Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.397706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" event={"ID":"d36f2285-2752-4cad-bf52-fe6ae0b262d1","Type":"ContainerDied","Data":"839de4434ebe21a5f0abbc718b56284e0f7743bf3463c809b6cae16fa7c2db5d"} Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.400895 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-979b8465b-gmztk" event={"ID":"6b1ea57e-dcf2-4e47-8650-af483b18ea8f","Type":"ContainerStarted","Data":"c50b1f228a346e798dee756ffe7172e5752cdc2bb9dd07e1f2ab8318d4e33a78"} Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.400939 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-979b8465b-gmztk" event={"ID":"6b1ea57e-dcf2-4e47-8650-af483b18ea8f","Type":"ContainerStarted","Data":"d41035e2a8cf606b335f710a1d0f91690ca9a161e3cc25d246206a6e3ab38420"} Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.403608 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.403727 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.407219 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.407397 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.437133 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.740861 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.828543 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f57669c89-7wt5g"] Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.844860 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.861552 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.861811 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.873075 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f57669c89-7wt5g"] Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-httpd-config\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-config\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969728 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-internal-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969756 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klz69\" (UniqueName: \"kubernetes.io/projected/49aa13b6-3343-43d5-949e-3118c1711ed0-kube-api-access-klz69\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969822 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-ovndb-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969841 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-public-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:22 crc kubenswrapper[4922]: I0218 11:57:22.969881 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-combined-ca-bundle\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.071342 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-ovndb-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.071522 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-public-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.071749 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-combined-ca-bundle\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.071894 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-httpd-config\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g" Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.072089 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-config\") pod 
\"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.072319 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-internal-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.072401 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klz69\" (UniqueName: \"kubernetes.io/projected/49aa13b6-3343-43d5-949e-3118c1711ed0-kube-api-access-klz69\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.079635 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-httpd-config\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.086311 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-public-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.087099 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-ovndb-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.087927 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-config\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.088565 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-internal-tls-certs\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.090487 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49aa13b6-3343-43d5-949e-3118c1711ed0-combined-ca-bundle\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.113556 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klz69\" (UniqueName: \"kubernetes.io/projected/49aa13b6-3343-43d5-949e-3118c1711ed0-kube-api-access-klz69\") pod \"neutron-f57669c89-7wt5g\" (UID: \"49aa13b6-3343-43d5-949e-3118c1711ed0\") " pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.223721 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.447316 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bd3cd2cf-8780-4de2-925c-5385d6398e49","Type":"ContainerStarted","Data":"9228fcb9048d9089f519e9840c6b71316b8c7d75be28af3c658ae2e0a85e1545"}
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.458035 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" event={"ID":"d36f2285-2752-4cad-bf52-fe6ae0b262d1","Type":"ContainerStarted","Data":"dcc50eb32469aa9ff5b74469bad439e5bb4a6c4458c1ea0f88da9a06c0a2437a"}
Feb 18 11:57:23 crc kubenswrapper[4922]: I0218 11:57:23.458667 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-78vwz"
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.101963 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" podStartSLOduration=5.101945297 podStartE2EDuration="5.101945297s" podCreationTimestamp="2026-02-18 11:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:23.502206538 +0000 UTC m=+1245.229910638" watchObservedRunningTime="2026-02-18 11:57:24.101945297 +0000 UTC m=+1245.829649377"
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.106693 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f57669c89-7wt5g"]
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.468722 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f57669c89-7wt5g" event={"ID":"49aa13b6-3343-43d5-949e-3118c1711ed0","Type":"ContainerStarted","Data":"7530ef8b8060cfcd0f02e18c4c18bf4b5026b767146c5c28f0afb184447def2f"}
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.471649 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-979b8465b-gmztk" event={"ID":"6b1ea57e-dcf2-4e47-8650-af483b18ea8f","Type":"ContainerStarted","Data":"dd50276aac03478ea25adcde79bcd1337f3694428dc64ce90dcf6fe92e773c3c"}
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.471825 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.476482 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerStarted","Data":"5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34"}
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.476911 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="proxy-httpd" containerID="cri-o://5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34" gracePeriod=30
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.476951 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-notification-agent" containerID="cri-o://c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3" gracePeriod=30
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.476907 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-central-agent" containerID="cri-o://f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9" gracePeriod=30
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.476949 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="sg-core" containerID="cri-o://69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e" gracePeriod=30
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.502912 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-979b8465b-gmztk" podStartSLOduration=4.502892851 podStartE2EDuration="4.502892851s" podCreationTimestamp="2026-02-18 11:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:24.497202987 +0000 UTC m=+1246.224907077" watchObservedRunningTime="2026-02-18 11:57:24.502892851 +0000 UTC m=+1246.230596931"
Feb 18 11:57:24 crc kubenswrapper[4922]: I0218 11:57:24.525124 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.760810546 podStartE2EDuration="16.525102383s" podCreationTimestamp="2026-02-18 11:57:08 +0000 UTC" firstStartedPulling="2026-02-18 11:57:09.36797671 +0000 UTC m=+1231.095680800" lastFinishedPulling="2026-02-18 11:57:23.132268557 +0000 UTC m=+1244.859972637" observedRunningTime="2026-02-18 11:57:24.524906238 +0000 UTC m=+1246.252610318" watchObservedRunningTime="2026-02-18 11:57:24.525102383 +0000 UTC m=+1246.252806463"
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.501171 4922 generic.go:334] "Generic (PLEG): container finished" podID="72a01906-824d-4581-8d88-7d40a91786a1" containerID="5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34" exitCode=0
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.501789 4922 generic.go:334] "Generic (PLEG): container finished" podID="72a01906-824d-4581-8d88-7d40a91786a1" containerID="69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e" exitCode=2
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.501804 4922 generic.go:334] "Generic (PLEG): container finished" podID="72a01906-824d-4581-8d88-7d40a91786a1" containerID="f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9" exitCode=0
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.501923 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerDied","Data":"5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34"}
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.501957 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerDied","Data":"69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e"}
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.501972 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerDied","Data":"f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9"}
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.524153 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f57669c89-7wt5g" event={"ID":"49aa13b6-3343-43d5-949e-3118c1711ed0","Type":"ContainerStarted","Data":"6c9993c754311fb00ea2dde49190e698127552602ae7569fc62de61cbd67aee0"}
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.524225 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f57669c89-7wt5g" event={"ID":"49aa13b6-3343-43d5-949e-3118c1711ed0","Type":"ContainerStarted","Data":"9aa3a8ed7f07eb52dcfaf3d9aa4b738c355b70169e828b85dff70e19d80a137b"}
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.526639 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.536330 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bd3cd2cf-8780-4de2-925c-5385d6398e49","Type":"ContainerStarted","Data":"5266a6b6549185d692cec1e8a671b0c2fae8160f00d1dd69abb4f69e1b2e0601"}
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.564113 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f57669c89-7wt5g" podStartSLOduration=3.564087833 podStartE2EDuration="3.564087833s" podCreationTimestamp="2026-02-18 11:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:25.559382484 +0000 UTC m=+1247.287086574" watchObservedRunningTime="2026-02-18 11:57:25.564087833 +0000 UTC m=+1247.291791913"
Feb 18 11:57:25 crc kubenswrapper[4922]: I0218 11:57:25.601110 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.601087478 podStartE2EDuration="5.601087478s" podCreationTimestamp="2026-02-18 11:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:25.581537854 +0000 UTC m=+1247.309241934" watchObservedRunningTime="2026-02-18 11:57:25.601087478 +0000 UTC m=+1247.328791558"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.119984 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.239607 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.248681 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-scripts\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.248855 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-sg-core-conf-yaml\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.248926 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-run-httpd\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.248962 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-combined-ca-bundle\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.248995 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-config-data\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.249041 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqwwp\" (UniqueName: \"kubernetes.io/projected/72a01906-824d-4581-8d88-7d40a91786a1-kube-api-access-sqwwp\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.249106 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-log-httpd\") pod \"72a01906-824d-4581-8d88-7d40a91786a1\" (UID: \"72a01906-824d-4581-8d88-7d40a91786a1\") "
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.249858 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.249927 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.250473 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.250493 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72a01906-824d-4581-8d88-7d40a91786a1-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.258808 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a01906-824d-4581-8d88-7d40a91786a1-kube-api-access-sqwwp" (OuterVolumeSpecName: "kube-api-access-sqwwp") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "kube-api-access-sqwwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.259674 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-scripts" (OuterVolumeSpecName: "scripts") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.304240 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.349690 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.354439 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.354472 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.354486 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqwwp\" (UniqueName: \"kubernetes.io/projected/72a01906-824d-4581-8d88-7d40a91786a1-kube-api-access-sqwwp\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.354498 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.404698 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-config-data" (OuterVolumeSpecName: "config-data") pod "72a01906-824d-4581-8d88-7d40a91786a1" (UID: "72a01906-824d-4581-8d88-7d40a91786a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.456211 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a01906-824d-4581-8d88-7d40a91786a1-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.550270 4922 generic.go:334] "Generic (PLEG): container finished" podID="72a01906-824d-4581-8d88-7d40a91786a1" containerID="c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3" exitCode=0
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.551133 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.551303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerDied","Data":"c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3"}
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.551579 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72a01906-824d-4581-8d88-7d40a91786a1","Type":"ContainerDied","Data":"e51f7f2faa63d11b52bb16edb526931063add3c924782fc45c0056ce678908a1"}
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.551679 4922 scope.go:117] "RemoveContainer" containerID="5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.594496 4922 scope.go:117] "RemoveContainer" containerID="69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.596494 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.609113 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.618938 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.619448 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="proxy-httpd"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619469 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="proxy-httpd"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.619503 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-central-agent"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619511 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-central-agent"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.619532 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="sg-core"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619538 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="sg-core"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.619549 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-notification-agent"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619556 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-notification-agent"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619731 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="sg-core"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619747 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-central-agent"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619760 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="ceilometer-notification-agent"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.619784 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a01906-824d-4581-8d88-7d40a91786a1" containerName="proxy-httpd"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.620325 4922 scope.go:117] "RemoveContainer" containerID="c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.624491 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.627454 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.627509 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.648573 4922 scope.go:117] "RemoveContainer" containerID="f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.664920 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.697645 4922 scope.go:117] "RemoveContainer" containerID="5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.698684 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34\": container with ID starting with 5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34 not found: ID does not exist" containerID="5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.698735 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34"} err="failed to get container status \"5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34\": rpc error: code = NotFound desc = could not find container \"5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34\": container with ID starting with 5743fc1cc15b1c94b293ee77602825aaa6c5b1bdde4fb72ac39f92a37ff15f34 not found: ID does not exist"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.698764 4922 scope.go:117] "RemoveContainer" containerID="69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.700146 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e\": container with ID starting with 69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e not found: ID does not exist" containerID="69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.700176 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e"} err="failed to get container status \"69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e\": rpc error: code = NotFound desc = could not find container \"69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e\": container with ID starting with 69736a38daeb8610262f32bde5f3bd06b607bcef34f4fbe4b57e58aae6aee07e not found: ID does not exist"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.700190 4922 scope.go:117] "RemoveContainer" containerID="c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.700514 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3\": container with ID starting with c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3 not found: ID does not exist" containerID="c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.700551 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3"} err="failed to get container status \"c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3\": rpc error: code = NotFound desc = could not find container \"c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3\": container with ID starting with c4b4f11931af9a7b58f5f6e6efc41376cb3e1dcbdd51081feef30ab45a264ba3 not found: ID does not exist"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.700567 4922 scope.go:117] "RemoveContainer" containerID="f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9"
Feb 18 11:57:26 crc kubenswrapper[4922]: E0218 11:57:26.700956 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9\": container with ID starting with f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9 not found: ID does not exist" containerID="f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.700988 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9"} err="failed to get container status \"f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9\": rpc error: code = NotFound desc = could not find container \"f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9\": container with ID starting with f76ae30b441313706d2b84e1c38399b3498bd5d4bdc045db95788de5d490e8e9 not found: ID does not exist"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761341 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26464\" (UniqueName: \"kubernetes.io/projected/3ad26da5-56a1-4f67-aae9-ab321499352f-kube-api-access-26464\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761426 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-config-data\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761470 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-scripts\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761500 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761533 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-run-httpd\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761556 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-log-httpd\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.761597 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864273 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-run-httpd\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864632 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-log-httpd\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864701 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864771 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26464\" (UniqueName: \"kubernetes.io/projected/3ad26da5-56a1-4f67-aae9-ab321499352f-kube-api-access-26464\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864825 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-config-data\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864874 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-scripts\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-run-httpd\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.864916 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.865642 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-log-httpd\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.871959 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-scripts\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.872526 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.874495 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.874716 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-config-data\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.890399 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26464\" (UniqueName: \"kubernetes.io/projected/3ad26da5-56a1-4f67-aae9-ab321499352f-kube-api-access-26464\") pod \"ceilometer-0\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " pod="openstack/ceilometer-0"
Feb 18 11:57:26 crc kubenswrapper[4922]: I0218 11:57:26.950348 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:27 crc kubenswrapper[4922]: I0218 11:57:27.029201 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a01906-824d-4581-8d88-7d40a91786a1" path="/var/lib/kubelet/pods/72a01906-824d-4581-8d88-7d40a91786a1/volumes"
Feb 18 11:57:27 crc kubenswrapper[4922]: I0218 11:57:27.487277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:27 crc kubenswrapper[4922]: W0218 11:57:27.497875 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad26da5_56a1_4f67_aae9_ab321499352f.slice/crio-4ab86a79e660c19ee3ee370625c973f496ff01d2184f2823cb5772f2dd377459 WatchSource:0}: Error finding container 4ab86a79e660c19ee3ee370625c973f496ff01d2184f2823cb5772f2dd377459: Status 404 returned error can't find the container with id 4ab86a79e660c19ee3ee370625c973f496ff01d2184f2823cb5772f2dd377459
Feb 18 11:57:27 crc kubenswrapper[4922]: I0218 11:57:27.571819 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerStarted","Data":"4ab86a79e660c19ee3ee370625c973f496ff01d2184f2823cb5772f2dd377459"}
Feb 18 11:57:28 crc kubenswrapper[4922]: I0218 11:57:28.582849 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerStarted","Data":"4f7ad4cbfa0275aa87aaa646fee7583ae6b1acc937b50ade063f9b7ab68d66a4"}
Feb 18 11:57:29 crc kubenswrapper[4922]: I0218 11:57:29.085994 4922 kubelet.go:2437] "SyncLoop DELETE"
source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:29 crc kubenswrapper[4922]: I0218 11:57:29.601498 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerStarted","Data":"6840f0d264514efcd94bc4a02aec112cf5f76e7909fa91e4880bd5ff5ad6f1bc"} Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.369810 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.378590 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-log" containerID="cri-o://2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a" gracePeriod=30 Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.378769 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-httpd" containerID="cri-o://fae0743e5e0fb73717e68120a40713dc2bcdd3fa357699afed1bff0f5a1368e5" gracePeriod=30 Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.477275 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.646740 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.647278 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerName="dnsmasq-dns" containerID="cri-o://6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674" gracePeriod=10 Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.670026 
4922 generic.go:334] "Generic (PLEG): container finished" podID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerID="2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a" exitCode=143 Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.670163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24eb828d-acb3-4b88-96dc-8d3bb8c49e86","Type":"ContainerDied","Data":"2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a"} Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.709112 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerStarted","Data":"0426fee054517907fa41850ee4ebf4b625cb2e9874199241df43ce1756e27ff1"} Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.763146 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-p2pzf"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.764899 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.786463 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p2pzf"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.788427 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnpw\" (UniqueName: \"kubernetes.io/projected/157bc07b-77b8-4a29-b8e0-9a205215187b-kube-api-access-mmnpw\") pod \"nova-api-db-create-p2pzf\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.788553 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157bc07b-77b8-4a29-b8e0-9a205215187b-operator-scripts\") pod \"nova-api-db-create-p2pzf\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: E0218 11:57:30.790159 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24eb828d_acb3_4b88_96dc_8d3bb8c49e86.slice/crio-conmon-2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24eb828d_acb3_4b88_96dc_8d3bb8c49e86.slice/crio-2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a.scope\": RecentStats: unable to find data in memory cache]" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.851438 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-b9cbq"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.852608 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.890494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnpw\" (UniqueName: \"kubernetes.io/projected/157bc07b-77b8-4a29-b8e0-9a205215187b-kube-api-access-mmnpw\") pod \"nova-api-db-create-p2pzf\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.890572 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157bc07b-77b8-4a29-b8e0-9a205215187b-operator-scripts\") pod \"nova-api-db-create-p2pzf\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.890698 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea3a613-3571-4de4-be73-07a4db1c146e-operator-scripts\") pod \"nova-cell0-db-create-b9cbq\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.890744 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9855v\" (UniqueName: \"kubernetes.io/projected/cea3a613-3571-4de4-be73-07a4db1c146e-kube-api-access-9855v\") pod \"nova-cell0-db-create-b9cbq\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.892421 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157bc07b-77b8-4a29-b8e0-9a205215187b-operator-scripts\") pod \"nova-api-db-create-p2pzf\" (UID: 
\"157bc07b-77b8-4a29-b8e0-9a205215187b\") " pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.899468 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-b9cbq"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.941207 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnpw\" (UniqueName: \"kubernetes.io/projected/157bc07b-77b8-4a29-b8e0-9a205215187b-kube-api-access-mmnpw\") pod \"nova-api-db-create-p2pzf\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.956524 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-96dc-account-create-update-4px58"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.957945 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.960784 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.965395 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-96dc-account-create-update-4px58"] Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.992542 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxsg\" (UniqueName: \"kubernetes.io/projected/7513cf0a-f653-48b9-a365-9732179aaffc-kube-api-access-vbxsg\") pod \"nova-api-96dc-account-create-update-4px58\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.992649 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cea3a613-3571-4de4-be73-07a4db1c146e-operator-scripts\") pod \"nova-cell0-db-create-b9cbq\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.992686 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9855v\" (UniqueName: \"kubernetes.io/projected/cea3a613-3571-4de4-be73-07a4db1c146e-kube-api-access-9855v\") pod \"nova-cell0-db-create-b9cbq\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.992778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7513cf0a-f653-48b9-a365-9732179aaffc-operator-scripts\") pod \"nova-api-96dc-account-create-update-4px58\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:30 crc kubenswrapper[4922]: I0218 11:57:30.993610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea3a613-3571-4de4-be73-07a4db1c146e-operator-scripts\") pod \"nova-cell0-db-create-b9cbq\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.031074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9855v\" (UniqueName: \"kubernetes.io/projected/cea3a613-3571-4de4-be73-07a4db1c146e-kube-api-access-9855v\") pod \"nova-cell0-db-create-b9cbq\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.046595 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ddrmz"] Feb 18 11:57:31 crc 
kubenswrapper[4922]: I0218 11:57:31.048161 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.057828 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ddrmz"] Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.096675 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxsg\" (UniqueName: \"kubernetes.io/projected/7513cf0a-f653-48b9-a365-9732179aaffc-kube-api-access-vbxsg\") pod \"nova-api-96dc-account-create-update-4px58\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.096912 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7513cf0a-f653-48b9-a365-9732179aaffc-operator-scripts\") pod \"nova-api-96dc-account-create-update-4px58\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.097811 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7513cf0a-f653-48b9-a365-9732179aaffc-operator-scripts\") pod \"nova-api-96dc-account-create-update-4px58\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.137082 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxsg\" (UniqueName: \"kubernetes.io/projected/7513cf0a-f653-48b9-a365-9732179aaffc-kube-api-access-vbxsg\") pod \"nova-api-96dc-account-create-update-4px58\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " 
pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.161323 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ea20-account-create-update-k5t5k"] Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.162265 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.163323 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.169762 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea20-account-create-update-k5t5k"] Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.172777 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.184679 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.200280 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4f87\" (UniqueName: \"kubernetes.io/projected/41c3abe9-3a81-44ef-babf-818b176f6437-kube-api-access-v4f87\") pod \"nova-cell1-db-create-ddrmz\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.200448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c3abe9-3a81-44ef-babf-818b176f6437-operator-scripts\") pod \"nova-cell1-db-create-ddrmz\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.259619 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c19a-account-create-update-24shd"] Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.261270 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.272683 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.291998 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.295294 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c19a-account-create-update-24shd"] Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.304557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9360e33-9ae9-4b84-a898-c2c22626a565-operator-scripts\") pod \"nova-cell1-c19a-account-create-update-24shd\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.304604 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh78w\" (UniqueName: \"kubernetes.io/projected/f9360e33-9ae9-4b84-a898-c2c22626a565-kube-api-access-rh78w\") pod \"nova-cell1-c19a-account-create-update-24shd\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.304652 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c3abe9-3a81-44ef-babf-818b176f6437-operator-scripts\") pod \"nova-cell1-db-create-ddrmz\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.304707 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-operator-scripts\") pod \"nova-cell0-ea20-account-create-update-k5t5k\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 
11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.304785 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4f87\" (UniqueName: \"kubernetes.io/projected/41c3abe9-3a81-44ef-babf-818b176f6437-kube-api-access-v4f87\") pod \"nova-cell1-db-create-ddrmz\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.304822 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7dh4\" (UniqueName: \"kubernetes.io/projected/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-kube-api-access-k7dh4\") pod \"nova-cell0-ea20-account-create-update-k5t5k\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.305737 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c3abe9-3a81-44ef-babf-818b176f6437-operator-scripts\") pod \"nova-cell1-db-create-ddrmz\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.335225 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4f87\" (UniqueName: \"kubernetes.io/projected/41c3abe9-3a81-44ef-babf-818b176f6437-kube-api-access-v4f87\") pod \"nova-cell1-db-create-ddrmz\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.406301 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7dh4\" (UniqueName: \"kubernetes.io/projected/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-kube-api-access-k7dh4\") pod \"nova-cell0-ea20-account-create-update-k5t5k\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " 
pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.406901 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9360e33-9ae9-4b84-a898-c2c22626a565-operator-scripts\") pod \"nova-cell1-c19a-account-create-update-24shd\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.406936 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh78w\" (UniqueName: \"kubernetes.io/projected/f9360e33-9ae9-4b84-a898-c2c22626a565-kube-api-access-rh78w\") pod \"nova-cell1-c19a-account-create-update-24shd\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.407039 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-operator-scripts\") pod \"nova-cell0-ea20-account-create-update-k5t5k\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.410345 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9360e33-9ae9-4b84-a898-c2c22626a565-operator-scripts\") pod \"nova-cell1-c19a-account-create-update-24shd\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.414621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-operator-scripts\") pod \"nova-cell0-ea20-account-create-update-k5t5k\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.427998 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7dh4\" (UniqueName: \"kubernetes.io/projected/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-kube-api-access-k7dh4\") pod \"nova-cell0-ea20-account-create-update-k5t5k\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.428669 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh78w\" (UniqueName: \"kubernetes.io/projected/f9360e33-9ae9-4b84-a898-c2c22626a565-kube-api-access-rh78w\") pod \"nova-cell1-c19a-account-create-update-24shd\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.439470 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.524652 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.602094 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.800475 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.817616 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.821513 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-sb\") pod \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.821591 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2htl\" (UniqueName: \"kubernetes.io/projected/d61bee1b-0ee4-4c97-8d5d-8655406f124c-kube-api-access-k2htl\") pod \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.821656 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-svc\") pod \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.821683 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-nb\") pod \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") " Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.821714 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-swift-storage-0\") pod \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\" (UID: 
\"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") "
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.821822 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-config\") pod \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\" (UID: \"d61bee1b-0ee4-4c97-8d5d-8655406f124c\") "
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.863384 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61bee1b-0ee4-4c97-8d5d-8655406f124c-kube-api-access-k2htl" (OuterVolumeSpecName: "kube-api-access-k2htl") pod "d61bee1b-0ee4-4c97-8d5d-8655406f124c" (UID: "d61bee1b-0ee4-4c97-8d5d-8655406f124c"). InnerVolumeSpecName "kube-api-access-k2htl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.913221 4922 generic.go:334] "Generic (PLEG): container finished" podID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerID="6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674" exitCode=0
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.913273 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" event={"ID":"d61bee1b-0ee4-4c97-8d5d-8655406f124c","Type":"ContainerDied","Data":"6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674"}
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.913326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v" event={"ID":"d61bee1b-0ee4-4c97-8d5d-8655406f124c","Type":"ContainerDied","Data":"18a7a7ab6ce87177724ccea1ae9a737d84614c82a96458fedc35807aee0f2e52"}
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.913349 4922 scope.go:117] "RemoveContainer" containerID="6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674"
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.913559 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.928612 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2htl\" (UniqueName: \"kubernetes.io/projected/d61bee1b-0ee4-4c97-8d5d-8655406f124c-kube-api-access-k2htl\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.966047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-config" (OuterVolumeSpecName: "config") pod "d61bee1b-0ee4-4c97-8d5d-8655406f124c" (UID: "d61bee1b-0ee4-4c97-8d5d-8655406f124c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.976388 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p2pzf"]
Feb 18 11:57:31 crc kubenswrapper[4922]: I0218 11:57:31.990459 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-b9cbq"]
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.003656 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d61bee1b-0ee4-4c97-8d5d-8655406f124c" (UID: "d61bee1b-0ee4-4c97-8d5d-8655406f124c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.009579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d61bee1b-0ee4-4c97-8d5d-8655406f124c" (UID: "d61bee1b-0ee4-4c97-8d5d-8655406f124c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.040509 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.041202 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.041238 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.104862 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d61bee1b-0ee4-4c97-8d5d-8655406f124c" (UID: "d61bee1b-0ee4-4c97-8d5d-8655406f124c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.127561 4922 scope.go:117] "RemoveContainer" containerID="ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61"
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.144436 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.152892 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d61bee1b-0ee4-4c97-8d5d-8655406f124c" (UID: "d61bee1b-0ee4-4c97-8d5d-8655406f124c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.207440 4922 scope.go:117] "RemoveContainer" containerID="6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674"
Feb 18 11:57:32 crc kubenswrapper[4922]: E0218 11:57:32.207901 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674\": container with ID starting with 6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674 not found: ID does not exist" containerID="6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674"
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.207941 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674"} err="failed to get container status \"6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674\": rpc error: code = NotFound desc = could not find container \"6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674\": container with ID starting with 6fc40bd77d62f143a53ef0627d5b912914ae270a819ef4018ea2e7de4e360674 not found: ID does not exist"
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.207970 4922 scope.go:117] "RemoveContainer" containerID="ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61"
Feb 18 11:57:32 crc kubenswrapper[4922]: E0218 11:57:32.208214 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61\": container with ID starting with ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61 not found: ID does not exist" containerID="ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61"
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.208239 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61"} err="failed to get container status \"ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61\": rpc error: code = NotFound desc = could not find container \"ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61\": container with ID starting with ce7a1f8b346e07ed0990be6cb946929c06f52e0b61b2bdd4bba88e18309b0f61 not found: ID does not exist"
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.251078 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bee1b-0ee4-4c97-8d5d-8655406f124c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.280062 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"]
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.297876 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-jrg5v"]
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.396634 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-96dc-account-create-update-4px58"]
Feb 18 11:57:32 crc kubenswrapper[4922]: W0218 11:57:32.408820 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41c3abe9_3a81_44ef_babf_818b176f6437.slice/crio-6b97ca742e23751715b7c18d5f5f62f93230d2dcc1c0f65e7e7422315924cd92 WatchSource:0}: Error finding container 6b97ca742e23751715b7c18d5f5f62f93230d2dcc1c0f65e7e7422315924cd92: Status 404 returned error can't find the container with id 6b97ca742e23751715b7c18d5f5f62f93230d2dcc1c0f65e7e7422315924cd92
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.409763 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ddrmz"]
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.562550 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea20-account-create-update-k5t5k"]
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.834190 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c19a-account-create-update-24shd"]
Feb 18 11:57:32 crc kubenswrapper[4922]: I0218 11:57:32.952623 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" event={"ID":"9b28b3ba-c697-4cef-8e3f-e41317e3abe6","Type":"ContainerStarted","Data":"c71bbdea45ab7a402cd8fc37c94e31cbd03659099c218416879906794286646a"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.016408 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" path="/var/lib/kubelet/pods/d61bee1b-0ee4-4c97-8d5d-8655406f124c/volumes"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.016394 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-b9cbq" podStartSLOduration=3.016354996 podStartE2EDuration="3.016354996s" podCreationTimestamp="2026-02-18 11:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:32.999224133 +0000 UTC m=+1254.726928213" watchObservedRunningTime="2026-02-18 11:57:33.016354996 +0000 UTC m=+1254.744059076"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.022302 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-b9cbq" event={"ID":"cea3a613-3571-4de4-be73-07a4db1c146e","Type":"ContainerStarted","Data":"2ca4584a9bdc48778761e031dee58443f19954bb292722b1e84049c5d0d3891e"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.022454 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-b9cbq" event={"ID":"cea3a613-3571-4de4-be73-07a4db1c146e","Type":"ContainerStarted","Data":"73eb578f99941b6e8fcccd4f7146c408046aea775fac188592a56b6aa1c8c60e"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.022521 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96dc-account-create-update-4px58" event={"ID":"7513cf0a-f653-48b9-a365-9732179aaffc","Type":"ContainerStarted","Data":"64eecc2f7b37277d3eaa046c9a0ed3c760eb621a97a64a32162b927b6a47e4d1"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.022587 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96dc-account-create-update-4px58" event={"ID":"7513cf0a-f653-48b9-a365-9732179aaffc","Type":"ContainerStarted","Data":"99782e6d55a2e38110d1ce6513a88af301a487264155e726820a08e94b0d1c9f"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.022717 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c19a-account-create-update-24shd" event={"ID":"f9360e33-9ae9-4b84-a898-c2c22626a565","Type":"ContainerStarted","Data":"95ae0db4dc810a397e5536957907573f044ce4062de198f02463dffeab24a900"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.036765 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p2pzf" event={"ID":"157bc07b-77b8-4a29-b8e0-9a205215187b","Type":"ContainerStarted","Data":"6447a128ed7c2ef808d7f125af51fae05848cf0c3a78ee0cef1bb21c2071c85c"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.037066 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p2pzf" event={"ID":"157bc07b-77b8-4a29-b8e0-9a205215187b","Type":"ContainerStarted","Data":"15aa49b2b6d6e10c8c1597a89c8dfd815ac09c35f2814b8e479c59053ff9efa1"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.054128 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-96dc-account-create-update-4px58" podStartSLOduration=3.05410833 podStartE2EDuration="3.05410833s" podCreationTimestamp="2026-02-18 11:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:33.020778647 +0000 UTC m=+1254.748482727" watchObservedRunningTime="2026-02-18 11:57:33.05410833 +0000 UTC m=+1254.781812410"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.078713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerStarted","Data":"861840f7d8623349410807b78c7acecc1dbed44761c62e936d5db92618e71c43"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.078891 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-central-agent" containerID="cri-o://4f7ad4cbfa0275aa87aaa646fee7583ae6b1acc937b50ade063f9b7ab68d66a4" gracePeriod=30
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.079116 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.079136 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="proxy-httpd" containerID="cri-o://861840f7d8623349410807b78c7acecc1dbed44761c62e936d5db92618e71c43" gracePeriod=30
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.079148 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="sg-core" containerID="cri-o://0426fee054517907fa41850ee4ebf4b625cb2e9874199241df43ce1756e27ff1" gracePeriod=30
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.079159 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-notification-agent" containerID="cri-o://6840f0d264514efcd94bc4a02aec112cf5f76e7909fa91e4880bd5ff5ad6f1bc" gracePeriod=30
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.082329 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-p2pzf" podStartSLOduration=3.082303753 podStartE2EDuration="3.082303753s" podCreationTimestamp="2026-02-18 11:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:33.072040853 +0000 UTC m=+1254.799744943" watchObservedRunningTime="2026-02-18 11:57:33.082303753 +0000 UTC m=+1254.810007833"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.090275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ddrmz" event={"ID":"41c3abe9-3a81-44ef-babf-818b176f6437","Type":"ContainerStarted","Data":"bf9c600a2ed87d6610ddd93585d70fd608560cee157862758bf35ddf2c2a4754"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.090312 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ddrmz" event={"ID":"41c3abe9-3a81-44ef-babf-818b176f6437","Type":"ContainerStarted","Data":"6b97ca742e23751715b7c18d5f5f62f93230d2dcc1c0f65e7e7422315924cd92"}
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.132968 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.698479248 podStartE2EDuration="7.132948223s" podCreationTimestamp="2026-02-18 11:57:26 +0000 UTC" firstStartedPulling="2026-02-18 11:57:27.502755145 +0000 UTC m=+1249.230459225" lastFinishedPulling="2026-02-18 11:57:31.93722412 +0000 UTC m=+1253.664928200" observedRunningTime="2026-02-18 11:57:33.121924974 +0000 UTC m=+1254.849629054" watchObservedRunningTime="2026-02-18 11:57:33.132948223 +0000 UTC m=+1254.860652303"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.149759 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-ddrmz" podStartSLOduration=2.149738907 podStartE2EDuration="2.149738907s" podCreationTimestamp="2026-02-18 11:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:33.145977062 +0000 UTC m=+1254.873681162" watchObservedRunningTime="2026-02-18 11:57:33.149738907 +0000 UTC m=+1254.877442987"
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.395334 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.395675 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-log" containerID="cri-o://0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18" gracePeriod=30
Feb 18 11:57:33 crc kubenswrapper[4922]: I0218 11:57:33.396220 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-httpd" containerID="cri-o://be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4" gracePeriod=30
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.123006 4922 generic.go:334] "Generic (PLEG): container finished" podID="7513cf0a-f653-48b9-a365-9732179aaffc" containerID="64eecc2f7b37277d3eaa046c9a0ed3c760eb621a97a64a32162b927b6a47e4d1" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.123421 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96dc-account-create-update-4px58" event={"ID":"7513cf0a-f653-48b9-a365-9732179aaffc","Type":"ContainerDied","Data":"64eecc2f7b37277d3eaa046c9a0ed3c760eb621a97a64a32162b927b6a47e4d1"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.126937 4922 generic.go:334] "Generic (PLEG): container finished" podID="41c3abe9-3a81-44ef-babf-818b176f6437" containerID="bf9c600a2ed87d6610ddd93585d70fd608560cee157862758bf35ddf2c2a4754" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.127016 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ddrmz" event={"ID":"41c3abe9-3a81-44ef-babf-818b176f6437","Type":"ContainerDied","Data":"bf9c600a2ed87d6610ddd93585d70fd608560cee157862758bf35ddf2c2a4754"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.129700 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b28b3ba-c697-4cef-8e3f-e41317e3abe6" containerID="ac0ab0e9aaca817513e97dbed88ce8e6eac29d917cc8fc47fb5c8da1460429d9" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.129772 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" event={"ID":"9b28b3ba-c697-4cef-8e3f-e41317e3abe6","Type":"ContainerDied","Data":"ac0ab0e9aaca817513e97dbed88ce8e6eac29d917cc8fc47fb5c8da1460429d9"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.136641 4922 generic.go:334] "Generic (PLEG): container finished" podID="7810aaca-e072-467b-bba7-6a3e12310c68" containerID="0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18" exitCode=143
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.136938 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7810aaca-e072-467b-bba7-6a3e12310c68","Type":"ContainerDied","Data":"0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.144667 4922 generic.go:334] "Generic (PLEG): container finished" podID="cea3a613-3571-4de4-be73-07a4db1c146e" containerID="2ca4584a9bdc48778761e031dee58443f19954bb292722b1e84049c5d0d3891e" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.144745 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-b9cbq" event={"ID":"cea3a613-3571-4de4-be73-07a4db1c146e","Type":"ContainerDied","Data":"2ca4584a9bdc48778761e031dee58443f19954bb292722b1e84049c5d0d3891e"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.179584 4922 generic.go:334] "Generic (PLEG): container finished" podID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerID="861840f7d8623349410807b78c7acecc1dbed44761c62e936d5db92618e71c43" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.179624 4922 generic.go:334] "Generic (PLEG): container finished" podID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerID="0426fee054517907fa41850ee4ebf4b625cb2e9874199241df43ce1756e27ff1" exitCode=2
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.179632 4922 generic.go:334] "Generic (PLEG): container finished" podID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerID="6840f0d264514efcd94bc4a02aec112cf5f76e7909fa91e4880bd5ff5ad6f1bc" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.179732 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerDied","Data":"861840f7d8623349410807b78c7acecc1dbed44761c62e936d5db92618e71c43"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.179763 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerDied","Data":"0426fee054517907fa41850ee4ebf4b625cb2e9874199241df43ce1756e27ff1"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.179778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerDied","Data":"6840f0d264514efcd94bc4a02aec112cf5f76e7909fa91e4880bd5ff5ad6f1bc"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.189140 4922 generic.go:334] "Generic (PLEG): container finished" podID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerID="fae0743e5e0fb73717e68120a40713dc2bcdd3fa357699afed1bff0f5a1368e5" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.189229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24eb828d-acb3-4b88-96dc-8d3bb8c49e86","Type":"ContainerDied","Data":"fae0743e5e0fb73717e68120a40713dc2bcdd3fa357699afed1bff0f5a1368e5"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.211196 4922 generic.go:334] "Generic (PLEG): container finished" podID="f9360e33-9ae9-4b84-a898-c2c22626a565" containerID="9eae3101b2310737957f7e6d08c731592c72422d2cd0b2731a1d4e5979cf4d34" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.211278 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c19a-account-create-update-24shd" event={"ID":"f9360e33-9ae9-4b84-a898-c2c22626a565","Type":"ContainerDied","Data":"9eae3101b2310737957f7e6d08c731592c72422d2cd0b2731a1d4e5979cf4d34"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.213526 4922 generic.go:334] "Generic (PLEG): container finished" podID="157bc07b-77b8-4a29-b8e0-9a205215187b" containerID="6447a128ed7c2ef808d7f125af51fae05848cf0c3a78ee0cef1bb21c2071c85c" exitCode=0
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.213571 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p2pzf" event={"ID":"157bc07b-77b8-4a29-b8e0-9a205215187b","Type":"ContainerDied","Data":"6447a128ed7c2ef808d7f125af51fae05848cf0c3a78ee0cef1bb21c2071c85c"}
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.337033 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.513417 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-httpd-run\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.513486 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-logs\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.513512 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-scripts\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.513622 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-config-data\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.513659 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-combined-ca-bundle\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.513690 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.514744 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcrkn\" (UniqueName: \"kubernetes.io/projected/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-kube-api-access-qcrkn\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.514766 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-internal-tls-certs\") pod \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\" (UID: \"24eb828d-acb3-4b88-96dc-8d3bb8c49e86\") "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.516131 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.516283 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-logs" (OuterVolumeSpecName: "logs") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.526679 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-scripts" (OuterVolumeSpecName: "scripts") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.531423 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-kube-api-access-qcrkn" (OuterVolumeSpecName: "kube-api-access-qcrkn") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "kube-api-access-qcrkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.553242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.562495 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.578046 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-config-data" (OuterVolumeSpecName: "config-data") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.589700 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "24eb828d-acb3-4b88-96dc-8d3bb8c49e86" (UID: "24eb828d-acb3-4b88-96dc-8d3bb8c49e86"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.618726 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619069 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcrkn\" (UniqueName: \"kubernetes.io/projected/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-kube-api-access-qcrkn\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619157 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619230 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619302 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-logs\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619390 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619494 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.619579 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb828d-acb3-4b88-96dc-8d3bb8c49e86-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.638166 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Feb 18 11:57:34 crc kubenswrapper[4922]: I0218 11:57:34.722335 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.223237 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24eb828d-acb3-4b88-96dc-8d3bb8c49e86","Type":"ContainerDied","Data":"205155ab7a59a38e41604dd6477d8795045c94879e29fefe3e7383b4bc42a275"}
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.223307 4922 scope.go:117] "RemoveContainer" containerID="fae0743e5e0fb73717e68120a40713dc2bcdd3fa357699afed1bff0f5a1368e5"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.226541 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.257190 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.263585 4922 scope.go:117] "RemoveContainer" containerID="2c9d001f560c7f93208a3584bc230854257ebcad49ef86ea326e6025425c596a"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.273748 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.289711 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 11:57:35 crc kubenswrapper[4922]: E0218 11:57:35.290191 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerName="init"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290205 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerName="init"
Feb 18 11:57:35 crc kubenswrapper[4922]: E0218 11:57:35.290225 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerName="dnsmasq-dns"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290231 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerName="dnsmasq-dns"
Feb 18 11:57:35 crc kubenswrapper[4922]: E0218 11:57:35.290243 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-httpd"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290250 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-httpd"
Feb 18 11:57:35 crc kubenswrapper[4922]: E0218 11:57:35.290270 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-log"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290276 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-log"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290482 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61bee1b-0ee4-4c97-8d5d-8655406f124c" containerName="dnsmasq-dns"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290496 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-httpd"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.290512 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" containerName="glance-log"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.291625 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.294919 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.311091 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.341744 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.437737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.437791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.437839 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.437976 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-logs\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.438000 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgznr\" (UniqueName: \"kubernetes.io/projected/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-kube-api-access-fgznr\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.438019 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.438060 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.438230 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544611 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544638 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544678 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544739 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-logs\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgznr\" (UniqueName: 
\"kubernetes.io/projected/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-kube-api-access-fgznr\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544797 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.544854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.545379 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.546671 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-logs\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.546853 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.555428 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.555542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.555552 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.556057 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.577197 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgznr\" (UniqueName: \"kubernetes.io/projected/342c8bfd-c2d6-4afd-b2be-3e1474b63b62-kube-api-access-fgznr\") pod \"glance-default-internal-api-0\" (UID: 
\"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.625303 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"342c8bfd-c2d6-4afd-b2be-3e1474b63b62\") " pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.770217 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.927188 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.949221 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.954985 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmnpw\" (UniqueName: \"kubernetes.io/projected/157bc07b-77b8-4a29-b8e0-9a205215187b-kube-api-access-mmnpw\") pod \"157bc07b-77b8-4a29-b8e0-9a205215187b\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.955193 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157bc07b-77b8-4a29-b8e0-9a205215187b-operator-scripts\") pod \"157bc07b-77b8-4a29-b8e0-9a205215187b\" (UID: \"157bc07b-77b8-4a29-b8e0-9a205215187b\") " Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.955851 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157bc07b-77b8-4a29-b8e0-9a205215187b-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "157bc07b-77b8-4a29-b8e0-9a205215187b" (UID: "157bc07b-77b8-4a29-b8e0-9a205215187b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.956492 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157bc07b-77b8-4a29-b8e0-9a205215187b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.963544 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157bc07b-77b8-4a29-b8e0-9a205215187b-kube-api-access-mmnpw" (OuterVolumeSpecName: "kube-api-access-mmnpw") pod "157bc07b-77b8-4a29-b8e0-9a205215187b" (UID: "157bc07b-77b8-4a29-b8e0-9a205215187b"). InnerVolumeSpecName "kube-api-access-mmnpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.990843 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:35 crc kubenswrapper[4922]: I0218 11:57:35.999159 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.027508 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.038531 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.059252 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh78w\" (UniqueName: \"kubernetes.io/projected/f9360e33-9ae9-4b84-a898-c2c22626a565-kube-api-access-rh78w\") pod \"f9360e33-9ae9-4b84-a898-c2c22626a565\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.059331 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9360e33-9ae9-4b84-a898-c2c22626a565-operator-scripts\") pod \"f9360e33-9ae9-4b84-a898-c2c22626a565\" (UID: \"f9360e33-9ae9-4b84-a898-c2c22626a565\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.060210 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9360e33-9ae9-4b84-a898-c2c22626a565-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9360e33-9ae9-4b84-a898-c2c22626a565" (UID: "f9360e33-9ae9-4b84-a898-c2c22626a565"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.064602 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmnpw\" (UniqueName: \"kubernetes.io/projected/157bc07b-77b8-4a29-b8e0-9a205215187b-kube-api-access-mmnpw\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.066012 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9360e33-9ae9-4b84-a898-c2c22626a565-kube-api-access-rh78w" (OuterVolumeSpecName: "kube-api-access-rh78w") pod "f9360e33-9ae9-4b84-a898-c2c22626a565" (UID: "f9360e33-9ae9-4b84-a898-c2c22626a565"). InnerVolumeSpecName "kube-api-access-rh78w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.171802 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4f87\" (UniqueName: \"kubernetes.io/projected/41c3abe9-3a81-44ef-babf-818b176f6437-kube-api-access-v4f87\") pod \"41c3abe9-3a81-44ef-babf-818b176f6437\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.171863 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9855v\" (UniqueName: \"kubernetes.io/projected/cea3a613-3571-4de4-be73-07a4db1c146e-kube-api-access-9855v\") pod \"cea3a613-3571-4de4-be73-07a4db1c146e\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.171987 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbxsg\" (UniqueName: \"kubernetes.io/projected/7513cf0a-f653-48b9-a365-9732179aaffc-kube-api-access-vbxsg\") pod \"7513cf0a-f653-48b9-a365-9732179aaffc\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172046 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7513cf0a-f653-48b9-a365-9732179aaffc-operator-scripts\") pod \"7513cf0a-f653-48b9-a365-9732179aaffc\" (UID: \"7513cf0a-f653-48b9-a365-9732179aaffc\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c3abe9-3a81-44ef-babf-818b176f6437-operator-scripts\") pod \"41c3abe9-3a81-44ef-babf-818b176f6437\" (UID: \"41c3abe9-3a81-44ef-babf-818b176f6437\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172187 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-operator-scripts\") pod \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172230 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea3a613-3571-4de4-be73-07a4db1c146e-operator-scripts\") pod \"cea3a613-3571-4de4-be73-07a4db1c146e\" (UID: \"cea3a613-3571-4de4-be73-07a4db1c146e\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172293 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7dh4\" (UniqueName: \"kubernetes.io/projected/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-kube-api-access-k7dh4\") pod \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\" (UID: \"9b28b3ba-c697-4cef-8e3f-e41317e3abe6\") " Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172810 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh78w\" (UniqueName: \"kubernetes.io/projected/f9360e33-9ae9-4b84-a898-c2c22626a565-kube-api-access-rh78w\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.172830 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9360e33-9ae9-4b84-a898-c2c22626a565-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.174437 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7513cf0a-f653-48b9-a365-9732179aaffc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7513cf0a-f653-48b9-a365-9732179aaffc" (UID: "7513cf0a-f653-48b9-a365-9732179aaffc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.176059 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c3abe9-3a81-44ef-babf-818b176f6437-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41c3abe9-3a81-44ef-babf-818b176f6437" (UID: "41c3abe9-3a81-44ef-babf-818b176f6437"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.176087 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b28b3ba-c697-4cef-8e3f-e41317e3abe6" (UID: "9b28b3ba-c697-4cef-8e3f-e41317e3abe6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.176484 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cea3a613-3571-4de4-be73-07a4db1c146e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cea3a613-3571-4de4-be73-07a4db1c146e" (UID: "cea3a613-3571-4de4-be73-07a4db1c146e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.178769 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea3a613-3571-4de4-be73-07a4db1c146e-kube-api-access-9855v" (OuterVolumeSpecName: "kube-api-access-9855v") pod "cea3a613-3571-4de4-be73-07a4db1c146e" (UID: "cea3a613-3571-4de4-be73-07a4db1c146e"). InnerVolumeSpecName "kube-api-access-9855v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.179102 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-kube-api-access-k7dh4" (OuterVolumeSpecName: "kube-api-access-k7dh4") pod "9b28b3ba-c697-4cef-8e3f-e41317e3abe6" (UID: "9b28b3ba-c697-4cef-8e3f-e41317e3abe6"). InnerVolumeSpecName "kube-api-access-k7dh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.179266 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7513cf0a-f653-48b9-a365-9732179aaffc-kube-api-access-vbxsg" (OuterVolumeSpecName: "kube-api-access-vbxsg") pod "7513cf0a-f653-48b9-a365-9732179aaffc" (UID: "7513cf0a-f653-48b9-a365-9732179aaffc"). InnerVolumeSpecName "kube-api-access-vbxsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.188978 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c3abe9-3a81-44ef-babf-818b176f6437-kube-api-access-v4f87" (OuterVolumeSpecName: "kube-api-access-v4f87") pod "41c3abe9-3a81-44ef-babf-818b176f6437" (UID: "41c3abe9-3a81-44ef-babf-818b176f6437"). InnerVolumeSpecName "kube-api-access-v4f87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.239101 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-96dc-account-create-update-4px58" event={"ID":"7513cf0a-f653-48b9-a365-9732179aaffc","Type":"ContainerDied","Data":"99782e6d55a2e38110d1ce6513a88af301a487264155e726820a08e94b0d1c9f"} Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.239140 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99782e6d55a2e38110d1ce6513a88af301a487264155e726820a08e94b0d1c9f" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.239196 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-96dc-account-create-update-4px58" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.244759 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c19a-account-create-update-24shd" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.245693 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c19a-account-create-update-24shd" event={"ID":"f9360e33-9ae9-4b84-a898-c2c22626a565","Type":"ContainerDied","Data":"95ae0db4dc810a397e5536957907573f044ce4062de198f02463dffeab24a900"} Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.245736 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95ae0db4dc810a397e5536957907573f044ce4062de198f02463dffeab24a900" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.247325 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p2pzf" event={"ID":"157bc07b-77b8-4a29-b8e0-9a205215187b","Type":"ContainerDied","Data":"15aa49b2b6d6e10c8c1597a89c8dfd815ac09c35f2814b8e479c59053ff9efa1"} Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.247389 4922 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="15aa49b2b6d6e10c8c1597a89c8dfd815ac09c35f2814b8e479c59053ff9efa1" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.247449 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p2pzf" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.257656 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ddrmz" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.257663 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ddrmz" event={"ID":"41c3abe9-3a81-44ef-babf-818b176f6437","Type":"ContainerDied","Data":"6b97ca742e23751715b7c18d5f5f62f93230d2dcc1c0f65e7e7422315924cd92"} Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.257695 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b97ca742e23751715b7c18d5f5f62f93230d2dcc1c0f65e7e7422315924cd92" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.266553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" event={"ID":"9b28b3ba-c697-4cef-8e3f-e41317e3abe6","Type":"ContainerDied","Data":"c71bbdea45ab7a402cd8fc37c94e31cbd03659099c218416879906794286646a"} Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.266623 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c71bbdea45ab7a402cd8fc37c94e31cbd03659099c218416879906794286646a" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.266710 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea20-account-create-update-k5t5k" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.273283 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-b9cbq" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.273552 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-b9cbq" event={"ID":"cea3a613-3571-4de4-be73-07a4db1c146e","Type":"ContainerDied","Data":"73eb578f99941b6e8fcccd4f7146c408046aea775fac188592a56b6aa1c8c60e"} Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.273615 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73eb578f99941b6e8fcccd4f7146c408046aea775fac188592a56b6aa1c8c60e" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276568 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276605 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cea3a613-3571-4de4-be73-07a4db1c146e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276619 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7dh4\" (UniqueName: \"kubernetes.io/projected/9b28b3ba-c697-4cef-8e3f-e41317e3abe6-kube-api-access-k7dh4\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276635 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4f87\" (UniqueName: \"kubernetes.io/projected/41c3abe9-3a81-44ef-babf-818b176f6437-kube-api-access-v4f87\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276647 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9855v\" (UniqueName: \"kubernetes.io/projected/cea3a613-3571-4de4-be73-07a4db1c146e-kube-api-access-9855v\") on node \"crc\" DevicePath \"\"" Feb 18 
11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276661 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbxsg\" (UniqueName: \"kubernetes.io/projected/7513cf0a-f653-48b9-a365-9732179aaffc-kube-api-access-vbxsg\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276674 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7513cf0a-f653-48b9-a365-9732179aaffc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.276686 4922 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c3abe9-3a81-44ef-babf-818b176f6437-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:36 crc kubenswrapper[4922]: I0218 11:57:36.578481 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 11:57:36 crc kubenswrapper[4922]: W0218 11:57:36.578558 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod342c8bfd_c2d6_4afd_b2be_3e1474b63b62.slice/crio-933fa04c15f9dacf8ce6f064b1ad2f6a28db246e3d767262f15546b2719d0cb1 WatchSource:0}: Error finding container 933fa04c15f9dacf8ce6f064b1ad2f6a28db246e3d767262f15546b2719d0cb1: Status 404 returned error can't find the container with id 933fa04c15f9dacf8ce6f064b1ad2f6a28db246e3d767262f15546b2719d0cb1 Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.010204 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24eb828d-acb3-4b88-96dc-8d3bb8c49e86" path="/var/lib/kubelet/pods/24eb828d-acb3-4b88-96dc-8d3bb8c49e86/volumes" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.281179 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.291875 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"342c8bfd-c2d6-4afd-b2be-3e1474b63b62","Type":"ContainerStarted","Data":"933fa04c15f9dacf8ce6f064b1ad2f6a28db246e3d767262f15546b2719d0cb1"} Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.294291 4922 generic.go:334] "Generic (PLEG): container finished" podID="7810aaca-e072-467b-bba7-6a3e12310c68" containerID="be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4" exitCode=0 Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.294330 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7810aaca-e072-467b-bba7-6a3e12310c68","Type":"ContainerDied","Data":"be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4"} Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.294346 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.294353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7810aaca-e072-467b-bba7-6a3e12310c68","Type":"ContainerDied","Data":"9937aae78bceb48bd4f47887b4b7c1fa9f743a0bf2b9a03c23a054415125619f"} Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.294419 4922 scope.go:117] "RemoveContainer" containerID="be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.344690 4922 scope.go:117] "RemoveContainer" containerID="0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.401331 4922 scope.go:117] "RemoveContainer" containerID="be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.401857 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4\": container with ID starting with be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4 not found: ID does not exist" containerID="be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.401898 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4"} err="failed to get container status \"be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4\": rpc error: code = NotFound desc = could not find container \"be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4\": container with ID starting with be139187f859b824114f17925d841efb955f0e7b83d0a0144a156b5f7b3e4df4 not found: ID does not exist" Feb 18 11:57:37 crc 
kubenswrapper[4922]: I0218 11:57:37.401928 4922 scope.go:117] "RemoveContainer" containerID="0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.402272 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18\": container with ID starting with 0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18 not found: ID does not exist" containerID="0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.402305 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18"} err="failed to get container status \"0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18\": rpc error: code = NotFound desc = could not find container \"0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18\": container with ID starting with 0619ad10d88107a566fb4c0dbd0711c30708a02e901f59a47be08cbb81b88d18 not found: ID does not exist" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.415972 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-httpd-run\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-public-tls-certs\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416078 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-scripts\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416173 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416204 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-config-data\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416232 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp7nr\" (UniqueName: \"kubernetes.io/projected/7810aaca-e072-467b-bba7-6a3e12310c68-kube-api-access-zp7nr\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-logs\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416334 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-combined-ca-bundle\") pod \"7810aaca-e072-467b-bba7-6a3e12310c68\" (UID: \"7810aaca-e072-467b-bba7-6a3e12310c68\") " Feb 18 
11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.416806 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-logs" (OuterVolumeSpecName: "logs") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.417055 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.424440 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-scripts" (OuterVolumeSpecName: "scripts") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.431481 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.433647 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7810aaca-e072-467b-bba7-6a3e12310c68-kube-api-access-zp7nr" (OuterVolumeSpecName: "kube-api-access-zp7nr") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "kube-api-access-zp7nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.509415 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.522273 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.522320 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp7nr\" (UniqueName: \"kubernetes.io/projected/7810aaca-e072-467b-bba7-6a3e12310c68-kube-api-access-zp7nr\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.522339 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.522351 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.522383 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7810aaca-e072-467b-bba7-6a3e12310c68-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.522398 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.540531 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-config-data" (OuterVolumeSpecName: "config-data") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.572908 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.574474 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7810aaca-e072-467b-bba7-6a3e12310c68" (UID: "7810aaca-e072-467b-bba7-6a3e12310c68"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.624479 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.624514 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.624526 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810aaca-e072-467b-bba7-6a3e12310c68-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.707317 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.720334 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744126 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744548 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7513cf0a-f653-48b9-a365-9732179aaffc" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744561 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7513cf0a-f653-48b9-a365-9732179aaffc" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744576 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9360e33-9ae9-4b84-a898-c2c22626a565" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc 
kubenswrapper[4922]: I0218 11:57:37.744582 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9360e33-9ae9-4b84-a898-c2c22626a565" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744589 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c3abe9-3a81-44ef-babf-818b176f6437" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744595 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c3abe9-3a81-44ef-babf-818b176f6437" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744613 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-log" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744620 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-log" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744626 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b28b3ba-c697-4cef-8e3f-e41317e3abe6" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744631 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b28b3ba-c697-4cef-8e3f-e41317e3abe6" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744652 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157bc07b-77b8-4a29-b8e0-9a205215187b" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744658 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="157bc07b-77b8-4a29-b8e0-9a205215187b" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744675 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" 
containerName="glance-httpd" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744680 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-httpd" Feb 18 11:57:37 crc kubenswrapper[4922]: E0218 11:57:37.744687 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea3a613-3571-4de4-be73-07a4db1c146e" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744693 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea3a613-3571-4de4-be73-07a4db1c146e" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744845 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9360e33-9ae9-4b84-a898-c2c22626a565" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744858 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea3a613-3571-4de4-be73-07a4db1c146e" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744868 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c3abe9-3a81-44ef-babf-818b176f6437" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744878 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="157bc07b-77b8-4a29-b8e0-9a205215187b" containerName="mariadb-database-create" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744888 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-log" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744899 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7513cf0a-f653-48b9-a365-9732179aaffc" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744908 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9b28b3ba-c697-4cef-8e3f-e41317e3abe6" containerName="mariadb-account-create-update" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.744919 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" containerName="glance-httpd" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.745874 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.751986 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.753813 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.761474 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.930941 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931002 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931073 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f5056168-d177-4e40-813a-db20d428ce9a-logs\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931088 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931131 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5056168-d177-4e40-813a-db20d428ce9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931156 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931186 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:37 crc kubenswrapper[4922]: I0218 11:57:37.931231 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6ghvh\" (UniqueName: \"kubernetes.io/projected/f5056168-d177-4e40-813a-db20d428ce9a-kube-api-access-6ghvh\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033199 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5056168-d177-4e40-813a-db20d428ce9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033231 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033268 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghvh\" (UniqueName: 
\"kubernetes.io/projected/f5056168-d177-4e40-813a-db20d428ce9a-kube-api-access-6ghvh\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033342 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033383 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.033440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5056168-d177-4e40-813a-db20d428ce9a-logs\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.036255 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5056168-d177-4e40-813a-db20d428ce9a-logs\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.037012 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: 
\"f5056168-d177-4e40-813a-db20d428ce9a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.037165 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5056168-d177-4e40-813a-db20d428ce9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.039874 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.043299 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.044022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.044074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5056168-d177-4e40-813a-db20d428ce9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " 
pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.065687 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghvh\" (UniqueName: \"kubernetes.io/projected/f5056168-d177-4e40-813a-db20d428ce9a-kube-api-access-6ghvh\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.072525 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f5056168-d177-4e40-813a-db20d428ce9a\") " pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.320582 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"342c8bfd-c2d6-4afd-b2be-3e1474b63b62","Type":"ContainerStarted","Data":"7d799824fdf1a5ef664b15c2397490ac0c1217659ad5a3da53e1b8a625ebf5c6"} Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.320642 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"342c8bfd-c2d6-4afd-b2be-3e1474b63b62","Type":"ContainerStarted","Data":"653f5b3f2a92734abd872bb0b1bf1e7b948a6cc6b5fb9ca31a149128eede3844"} Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.324675 4922 generic.go:334] "Generic (PLEG): container finished" podID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerID="4f7ad4cbfa0275aa87aaa646fee7583ae6b1acc937b50ade063f9b7ab68d66a4" exitCode=0 Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.324723 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerDied","Data":"4f7ad4cbfa0275aa87aaa646fee7583ae6b1acc937b50ade063f9b7ab68d66a4"} Feb 18 
11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.350219 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.350197364 podStartE2EDuration="3.350197364s" podCreationTimestamp="2026-02-18 11:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:38.342861658 +0000 UTC m=+1260.070565738" watchObservedRunningTime="2026-02-18 11:57:38.350197364 +0000 UTC m=+1260.077901444" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.364276 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.694095 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.754178 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-log-httpd\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.754308 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-combined-ca-bundle\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.754456 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-sg-core-conf-yaml\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 
crc kubenswrapper[4922]: I0218 11:57:38.754495 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-config-data\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.754552 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-run-httpd\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.754627 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26464\" (UniqueName: \"kubernetes.io/projected/3ad26da5-56a1-4f67-aae9-ab321499352f-kube-api-access-26464\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.754697 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-scripts\") pod \"3ad26da5-56a1-4f67-aae9-ab321499352f\" (UID: \"3ad26da5-56a1-4f67-aae9-ab321499352f\") " Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.761852 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.761997 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.775909 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad26da5-56a1-4f67-aae9-ab321499352f-kube-api-access-26464" (OuterVolumeSpecName: "kube-api-access-26464") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "kube-api-access-26464". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.776750 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-scripts" (OuterVolumeSpecName: "scripts") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.814604 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.860534 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.860607 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.860733 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26464\" (UniqueName: \"kubernetes.io/projected/3ad26da5-56a1-4f67-aae9-ab321499352f-kube-api-access-26464\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.860751 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.860763 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ad26da5-56a1-4f67-aae9-ab321499352f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.900844 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.914292 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-config-data" (OuterVolumeSpecName: "config-data") pod "3ad26da5-56a1-4f67-aae9-ab321499352f" (UID: "3ad26da5-56a1-4f67-aae9-ab321499352f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.963943 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.963983 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad26da5-56a1-4f67-aae9-ab321499352f-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:38 crc kubenswrapper[4922]: I0218 11:57:38.989174 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7810aaca-e072-467b-bba7-6a3e12310c68" path="/var/lib/kubelet/pods/7810aaca-e072-467b-bba7-6a3e12310c68/volumes" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.024508 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 11:57:39 crc kubenswrapper[4922]: W0218 11:57:39.026011 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5056168_d177_4e40_813a_db20d428ce9a.slice/crio-cbc622c3c7d1ad91b0ac3fa62385fdfc6fb82690de70ed799bc25da48755f0c7 WatchSource:0}: Error finding container cbc622c3c7d1ad91b0ac3fa62385fdfc6fb82690de70ed799bc25da48755f0c7: Status 404 returned error can't find the container with id cbc622c3c7d1ad91b0ac3fa62385fdfc6fb82690de70ed799bc25da48755f0c7 Feb 18 11:57:39 crc 
kubenswrapper[4922]: I0218 11:57:39.339507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f5056168-d177-4e40-813a-db20d428ce9a","Type":"ContainerStarted","Data":"cbc622c3c7d1ad91b0ac3fa62385fdfc6fb82690de70ed799bc25da48755f0c7"} Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.342599 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.343480 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ad26da5-56a1-4f67-aae9-ab321499352f","Type":"ContainerDied","Data":"4ab86a79e660c19ee3ee370625c973f496ff01d2184f2823cb5772f2dd377459"} Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.343525 4922 scope.go:117] "RemoveContainer" containerID="861840f7d8623349410807b78c7acecc1dbed44761c62e936d5db92618e71c43" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.367845 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.377300 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.391992 4922 scope.go:117] "RemoveContainer" containerID="0426fee054517907fa41850ee4ebf4b625cb2e9874199241df43ce1756e27ff1" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.410987 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:39 crc kubenswrapper[4922]: E0218 11:57:39.411942 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="proxy-httpd" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.411967 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="proxy-httpd" Feb 18 11:57:39 crc kubenswrapper[4922]: 
E0218 11:57:39.411989 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-notification-agent" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.411997 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-notification-agent" Feb 18 11:57:39 crc kubenswrapper[4922]: E0218 11:57:39.412012 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-central-agent" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.412020 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-central-agent" Feb 18 11:57:39 crc kubenswrapper[4922]: E0218 11:57:39.412049 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="sg-core" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.412056 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="sg-core" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.412261 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-notification-agent" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.412283 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="ceilometer-central-agent" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.412299 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" containerName="sg-core" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.412315 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" 
containerName="proxy-httpd" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.418604 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.425138 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.425424 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.438989 4922 scope.go:117] "RemoveContainer" containerID="6840f0d264514efcd94bc4a02aec112cf5f76e7909fa91e4880bd5ff5ad6f1bc" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.451118 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.474277 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-config-data\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.474420 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.474500 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc 
kubenswrapper[4922]: I0218 11:57:39.474534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.474616 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.474964 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-scripts\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.475130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6h9k\" (UniqueName: \"kubernetes.io/projected/21959345-b5c7-4013-a975-3d02790d2e8a-kube-api-access-b6h9k\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.490128 4922 scope.go:117] "RemoveContainer" containerID="4f7ad4cbfa0275aa87aaa646fee7583ae6b1acc937b50ade063f9b7ab68d66a4" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.577859 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " 
pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578421 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-scripts\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578468 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6h9k\" (UniqueName: \"kubernetes.io/projected/21959345-b5c7-4013-a975-3d02790d2e8a-kube-api-access-b6h9k\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-config-data\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.578810 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.579122 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.587194 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.593950 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.598314 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-scripts\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.599754 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b6h9k\" (UniqueName: \"kubernetes.io/projected/21959345-b5c7-4013-a975-3d02790d2e8a-kube-api-access-b6h9k\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.600175 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-config-data\") pod \"ceilometer-0\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") " pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.759945 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.808826 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.808896 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.808965 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.809711 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6ab0358a6b4b84604aa8265da97113127295fc06806ed22c39a69885110c93fc"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 11:57:39 crc kubenswrapper[4922]: I0218 11:57:39.809777 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://6ab0358a6b4b84604aa8265da97113127295fc06806ed22c39a69885110c93fc" gracePeriod=600 Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.246046 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:57:40 crc kubenswrapper[4922]: W0218 11:57:40.255771 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21959345_b5c7_4013_a975_3d02790d2e8a.slice/crio-65b652de66b9a52f98793fd9afbf09e548d09526752352ff7c8d2022f8399ea5 WatchSource:0}: Error finding container 65b652de66b9a52f98793fd9afbf09e548d09526752352ff7c8d2022f8399ea5: Status 404 returned error can't find the container with id 65b652de66b9a52f98793fd9afbf09e548d09526752352ff7c8d2022f8399ea5 Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.396142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerStarted","Data":"65b652de66b9a52f98793fd9afbf09e548d09526752352ff7c8d2022f8399ea5"} Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.401258 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="6ab0358a6b4b84604aa8265da97113127295fc06806ed22c39a69885110c93fc" exitCode=0 Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.401320 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"6ab0358a6b4b84604aa8265da97113127295fc06806ed22c39a69885110c93fc"} Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.401347 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"ef7998df4ff2ac956dafb01bf87962d308e4ed2ee7ceb57165a1d59bde7c799c"} Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.401379 4922 scope.go:117] "RemoveContainer" containerID="3ecb5c1316b12312a3aa69dd6c259c33fc3f21c1c782e1e74d55d4b725fb05a8" Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.406862 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f5056168-d177-4e40-813a-db20d428ce9a","Type":"ContainerStarted","Data":"90371bd96ab264a9bb82ffb0fc0dd87731cc01ffcabe85aa1f861569e31349f5"} Feb 18 11:57:40 crc kubenswrapper[4922]: I0218 11:57:40.991643 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad26da5-56a1-4f67-aae9-ab321499352f" path="/var/lib/kubelet/pods/3ad26da5-56a1-4f67-aae9-ab321499352f/volumes" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.450636 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f5056168-d177-4e40-813a-db20d428ce9a","Type":"ContainerStarted","Data":"181d13923e340b31677d3b88a3afe9db22758e34cda4decd2bdf335c4bfb32bc"} Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.473690 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerStarted","Data":"cb478ec359f29d9399971270b2b6115201780be8066b8577d1cced1b5e9c1a76"} Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 
11:57:41.721677 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.72165909 podStartE2EDuration="4.72165909s" podCreationTimestamp="2026-02-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:40.45839042 +0000 UTC m=+1262.186094500" watchObservedRunningTime="2026-02-18 11:57:41.72165909 +0000 UTC m=+1263.449363180" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.732397 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws7m8"] Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.744189 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws7m8"] Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.744292 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.749016 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m8flj" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.749177 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.749320 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.926473 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 
11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.926845 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdr6h\" (UniqueName: \"kubernetes.io/projected/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-kube-api-access-hdr6h\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.926977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-scripts\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:41 crc kubenswrapper[4922]: I0218 11:57:41.927011 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-config-data\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.029026 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdr6h\" (UniqueName: \"kubernetes.io/projected/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-kube-api-access-hdr6h\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.029171 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-scripts\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " 
pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.029200 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-config-data\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.029313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.037271 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.046177 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-scripts\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.047321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdr6h\" (UniqueName: \"kubernetes.io/projected/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-kube-api-access-hdr6h\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " 
pod="openstack/nova-cell0-conductor-db-sync-ws7m8"
Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.050967 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-config-data\") pod \"nova-cell0-conductor-db-sync-ws7m8\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " pod="openstack/nova-cell0-conductor-db-sync-ws7m8"
Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.067204 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws7m8"
Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.500546 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerStarted","Data":"001d92f56b3881bb246a5b8d6020a874a2e98a7c7721ad75c44399ad7e7ed2db"}
Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.501193 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerStarted","Data":"99f8fed10436d49f464412fa00af054c4cb5c5d4c102938ed7110cfe0fb2e1b2"}
Feb 18 11:57:42 crc kubenswrapper[4922]: I0218 11:57:42.616400 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws7m8"]
Feb 18 11:57:43 crc kubenswrapper[4922]: I0218 11:57:43.553870 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" event={"ID":"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5","Type":"ContainerStarted","Data":"19238151c6895845aeb1fcd79101b75a8cace32e850eb034480705b42b7e45a0"}
Feb 18 11:57:44 crc kubenswrapper[4922]: I0218 11:57:44.568977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerStarted","Data":"c0108bf2b931e5acca4c64b4090bbf10772b51b4c02d9b81008189353130e77b"}
Feb 18 11:57:44 crc kubenswrapper[4922]: I0218 11:57:44.569302 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 11:57:44 crc kubenswrapper[4922]: I0218 11:57:44.603127 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.748672096 podStartE2EDuration="5.603107561s" podCreationTimestamp="2026-02-18 11:57:39 +0000 UTC" firstStartedPulling="2026-02-18 11:57:40.266965881 +0000 UTC m=+1261.994669961" lastFinishedPulling="2026-02-18 11:57:44.121401346 +0000 UTC m=+1265.849105426" observedRunningTime="2026-02-18 11:57:44.595174281 +0000 UTC m=+1266.322878371" watchObservedRunningTime="2026-02-18 11:57:44.603107561 +0000 UTC m=+1266.330811641"
Feb 18 11:57:45 crc kubenswrapper[4922]: I0218 11:57:45.928600 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:45 crc kubenswrapper[4922]: I0218 11:57:45.928914 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:45 crc kubenswrapper[4922]: I0218 11:57:45.987349 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:45 crc kubenswrapper[4922]: I0218 11:57:45.988515 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:46 crc kubenswrapper[4922]: I0218 11:57:46.591975 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:46 crc kubenswrapper[4922]: I0218 11:57:46.592025 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.365027 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.365391 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.396533 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.409119 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.619028 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.619311 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.899911 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.900027 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:57:48 crc kubenswrapper[4922]: I0218 11:57:48.952439 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 18 11:57:49 crc kubenswrapper[4922]: I0218 11:57:49.966660 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 18 11:57:49 crc kubenswrapper[4922]: I0218 11:57:49.968013 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="bdb6fddf-10f2-476b-822f-130f6fa12007" containerName="watcher-decision-engine" containerID="cri-o://bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115" gracePeriod=30
Feb 18 11:57:50 crc kubenswrapper[4922]: I0218 11:57:50.588113 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-979b8465b-gmztk"
Feb 18 11:57:51 crc kubenswrapper[4922]: I0218 11:57:51.207326 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:51 crc kubenswrapper[4922]: I0218 11:57:51.207446 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 18 11:57:51 crc kubenswrapper[4922]: I0218 11:57:51.257962 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.038970 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.039720 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-central-agent" containerID="cri-o://cb478ec359f29d9399971270b2b6115201780be8066b8577d1cced1b5e9c1a76" gracePeriod=30
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.039807 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="sg-core" containerID="cri-o://001d92f56b3881bb246a5b8d6020a874a2e98a7c7721ad75c44399ad7e7ed2db" gracePeriod=30
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.039881 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-notification-agent" containerID="cri-o://99f8fed10436d49f464412fa00af054c4cb5c5d4c102938ed7110cfe0fb2e1b2" gracePeriod=30
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.039842 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="proxy-httpd" containerID="cri-o://c0108bf2b931e5acca4c64b4090bbf10772b51b4c02d9b81008189353130e77b" gracePeriod=30
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.681235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" event={"ID":"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5","Type":"ContainerStarted","Data":"38158e702c80c7a31868549296123efd8bf7bfe465aae6ab472768cf855c3a38"}
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.683941 4922 generic.go:334] "Generic (PLEG): container finished" podID="21959345-b5c7-4013-a975-3d02790d2e8a" containerID="c0108bf2b931e5acca4c64b4090bbf10772b51b4c02d9b81008189353130e77b" exitCode=0
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.683976 4922 generic.go:334] "Generic (PLEG): container finished" podID="21959345-b5c7-4013-a975-3d02790d2e8a" containerID="001d92f56b3881bb246a5b8d6020a874a2e98a7c7721ad75c44399ad7e7ed2db" exitCode=2
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.684000 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerDied","Data":"c0108bf2b931e5acca4c64b4090bbf10772b51b4c02d9b81008189353130e77b"}
Feb 18 11:57:52 crc kubenswrapper[4922]: I0218 11:57:52.684030 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerDied","Data":"001d92f56b3881bb246a5b8d6020a874a2e98a7c7721ad75c44399ad7e7ed2db"}
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.241598 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f57669c89-7wt5g"
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.271120 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" podStartSLOduration=3.105033463 podStartE2EDuration="12.271101443s" podCreationTimestamp="2026-02-18 11:57:41 +0000 UTC" firstStartedPulling="2026-02-18 11:57:42.627388014 +0000 UTC m=+1264.355092094" lastFinishedPulling="2026-02-18 11:57:51.793455984 +0000 UTC m=+1273.521160074" observedRunningTime="2026-02-18 11:57:52.703833164 +0000 UTC m=+1274.431537244" watchObservedRunningTime="2026-02-18 11:57:53.271101443 +0000 UTC m=+1274.998805523"
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.320329 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-979b8465b-gmztk"]
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.320977 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-979b8465b-gmztk" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-api" containerID="cri-o://c50b1f228a346e798dee756ffe7172e5752cdc2bb9dd07e1f2ab8318d4e33a78" gracePeriod=30
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.321425 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-979b8465b-gmztk" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-httpd" containerID="cri-o://dd50276aac03478ea25adcde79bcd1337f3694428dc64ce90dcf6fe92e773c3c" gracePeriod=30
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.702693 4922 generic.go:334] "Generic (PLEG): container finished" podID="21959345-b5c7-4013-a975-3d02790d2e8a" containerID="99f8fed10436d49f464412fa00af054c4cb5c5d4c102938ed7110cfe0fb2e1b2" exitCode=0
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.702734 4922 generic.go:334] "Generic (PLEG): container finished" podID="21959345-b5c7-4013-a975-3d02790d2e8a" containerID="cb478ec359f29d9399971270b2b6115201780be8066b8577d1cced1b5e9c1a76" exitCode=0
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.702774 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerDied","Data":"99f8fed10436d49f464412fa00af054c4cb5c5d4c102938ed7110cfe0fb2e1b2"}
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.702831 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerDied","Data":"cb478ec359f29d9399971270b2b6115201780be8066b8577d1cced1b5e9c1a76"}
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.708527 4922 generic.go:334] "Generic (PLEG): container finished" podID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerID="dd50276aac03478ea25adcde79bcd1337f3694428dc64ce90dcf6fe92e773c3c" exitCode=0
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.709650 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-979b8465b-gmztk" event={"ID":"6b1ea57e-dcf2-4e47-8650-af483b18ea8f","Type":"ContainerDied","Data":"dd50276aac03478ea25adcde79bcd1337f3694428dc64ce90dcf6fe92e773c3c"}
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.854160 4922 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.905768 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-sg-core-conf-yaml\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.905852 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-config-data\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.905970 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6h9k\" (UniqueName: \"kubernetes.io/projected/21959345-b5c7-4013-a975-3d02790d2e8a-kube-api-access-b6h9k\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906009 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-combined-ca-bundle\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906113 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-run-httpd\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-scripts\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906217 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-log-httpd\") pod \"21959345-b5c7-4013-a975-3d02790d2e8a\" (UID: \"21959345-b5c7-4013-a975-3d02790d2e8a\") "
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906506 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906757 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.906875 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.914762 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-scripts" (OuterVolumeSpecName: "scripts") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.928634 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21959345-b5c7-4013-a975-3d02790d2e8a-kube-api-access-b6h9k" (OuterVolumeSpecName: "kube-api-access-b6h9k") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "kube-api-access-b6h9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:53 crc kubenswrapper[4922]: I0218 11:57:53.952526 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.009897 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.009937 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21959345-b5c7-4013-a975-3d02790d2e8a-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.009950 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.009962 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6h9k\" (UniqueName: \"kubernetes.io/projected/21959345-b5c7-4013-a975-3d02790d2e8a-kube-api-access-b6h9k\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.044462 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-config-data" (OuterVolumeSpecName: "config-data") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.053521 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21959345-b5c7-4013-a975-3d02790d2e8a" (UID: "21959345-b5c7-4013-a975-3d02790d2e8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.113039 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.113076 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21959345-b5c7-4013-a975-3d02790d2e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.722601 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21959345-b5c7-4013-a975-3d02790d2e8a","Type":"ContainerDied","Data":"65b652de66b9a52f98793fd9afbf09e548d09526752352ff7c8d2022f8399ea5"}
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.722652 4922 scope.go:117] "RemoveContainer" containerID="c0108bf2b931e5acca4c64b4090bbf10772b51b4c02d9b81008189353130e77b"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.722717 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.757199 4922 scope.go:117] "RemoveContainer" containerID="001d92f56b3881bb246a5b8d6020a874a2e98a7c7721ad75c44399ad7e7ed2db"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.767767 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.788800 4922 scope.go:117] "RemoveContainer" containerID="99f8fed10436d49f464412fa00af054c4cb5c5d4c102938ed7110cfe0fb2e1b2"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.792017 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.809496 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:54 crc kubenswrapper[4922]: E0218 11:57:54.810362 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="proxy-httpd"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810402 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="proxy-httpd"
Feb 18 11:57:54 crc kubenswrapper[4922]: E0218 11:57:54.810423 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-central-agent"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810431 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-central-agent"
Feb 18 11:57:54 crc kubenswrapper[4922]: E0218 11:57:54.810446 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-notification-agent"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810455 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-notification-agent"
Feb 18 11:57:54 crc kubenswrapper[4922]: E0218 11:57:54.810464 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="sg-core"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810471 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="sg-core"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810707 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="proxy-httpd"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810724 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-notification-agent"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810744 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="ceilometer-central-agent"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.810757 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" containerName="sg-core"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.812402 4922 scope.go:117] "RemoveContainer" containerID="cb478ec359f29d9399971270b2b6115201780be8066b8577d1cced1b5e9c1a76"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.815083 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.818902 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.818952 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.837630 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.934573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.934616 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-config-data\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.934635 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.934709 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-scripts\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") "
pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.934928 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkcct\" (UniqueName: \"kubernetes.io/projected/01766e5f-d149-4175-9fdb-15e65b0e0665-kube-api-access-bkcct\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.934999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-log-httpd\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.935046 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-run-httpd\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:54 crc kubenswrapper[4922]: I0218 11:57:54.986272 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21959345-b5c7-4013-a975-3d02790d2e8a" path="/var/lib/kubelet/pods/21959345-b5c7-4013-a975-3d02790d2e8a/volumes"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.037663 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkcct\" (UniqueName: \"kubernetes.io/projected/01766e5f-d149-4175-9fdb-15e65b0e0665-kube-api-access-bkcct\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.037842 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-log-httpd\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.037900 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-run-httpd\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.037992 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.038119 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-config-data\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.038149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.038369 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-scripts\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.038736 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-log-httpd\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.039388 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-run-httpd\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.042478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.043207 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-scripts\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.043288 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.052884 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-config-data\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.063895 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkcct\" (UniqueName: \"kubernetes.io/projected/01766e5f-d149-4175-9fdb-15e65b0e0665-kube-api-access-bkcct\") pod \"ceilometer-0\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.138901 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:57:55 crc kubenswrapper[4922]: W0218 11:57:55.595342 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01766e5f_d149_4175_9fdb_15e65b0e0665.slice/crio-9caa36cbf4a061073a94710aba88a19e4ceb523f9ac965d8c82018ec849187a8 WatchSource:0}: Error finding container 9caa36cbf4a061073a94710aba88a19e4ceb523f9ac965d8c82018ec849187a8: Status 404 returned error can't find the container with id 9caa36cbf4a061073a94710aba88a19e4ceb523f9ac965d8c82018ec849187a8
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.599131 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:55 crc kubenswrapper[4922]: I0218 11:57:55.733438 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerStarted","Data":"9caa36cbf4a061073a94710aba88a19e4ceb523f9ac965d8c82018ec849187a8"}
Feb 18 11:57:56 crc kubenswrapper[4922]: I0218 11:57:56.765642 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerStarted","Data":"6c511819319a5fc193d0f75ed63d9083c79d2c12407691912b456da48212acab"}
Feb 18 11:57:57 crc kubenswrapper[4922]: I0218 11:57:57.776090 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerStarted","Data":"d7cb0d481469976e2f0d27c45a95752ec61c6325990c025f51f551cbf42c3d51"}
Feb 18 11:57:58 crc kubenswrapper[4922]: I0218 11:57:58.788921 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerStarted","Data":"019a6342d78a17e9007569b8df388ae4cd83074ceff02bfe55eb5c3a71054609"}
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.265296 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.616045 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.741592 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-custom-prometheus-ca\") pod \"bdb6fddf-10f2-476b-822f-130f6fa12007\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") "
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.741689 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff6wh\" (UniqueName: \"kubernetes.io/projected/bdb6fddf-10f2-476b-822f-130f6fa12007-kube-api-access-ff6wh\") pod \"bdb6fddf-10f2-476b-822f-130f6fa12007\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") "
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.741791 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb6fddf-10f2-476b-822f-130f6fa12007-logs\") pod \"bdb6fddf-10f2-476b-822f-130f6fa12007\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") "
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.741870 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-combined-ca-bundle\") pod \"bdb6fddf-10f2-476b-822f-130f6fa12007\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") "
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.742148 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-config-data\") pod \"bdb6fddf-10f2-476b-822f-130f6fa12007\" (UID: \"bdb6fddf-10f2-476b-822f-130f6fa12007\") "
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.742850 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb6fddf-10f2-476b-822f-130f6fa12007-logs" (OuterVolumeSpecName: "logs") pod "bdb6fddf-10f2-476b-822f-130f6fa12007" (UID: "bdb6fddf-10f2-476b-822f-130f6fa12007"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.748641 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb6fddf-10f2-476b-822f-130f6fa12007-kube-api-access-ff6wh" (OuterVolumeSpecName: "kube-api-access-ff6wh") pod "bdb6fddf-10f2-476b-822f-130f6fa12007" (UID: "bdb6fddf-10f2-476b-822f-130f6fa12007"). InnerVolumeSpecName "kube-api-access-ff6wh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.796556 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdb6fddf-10f2-476b-822f-130f6fa12007" (UID: "bdb6fddf-10f2-476b-822f-130f6fa12007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.818865 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "bdb6fddf-10f2-476b-822f-130f6fa12007" (UID: "bdb6fddf-10f2-476b-822f-130f6fa12007"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.827677 4922 generic.go:334] "Generic (PLEG): container finished" podID="bdb6fddf-10f2-476b-822f-130f6fa12007" containerID="bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115" exitCode=0
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.827782 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bdb6fddf-10f2-476b-822f-130f6fa12007","Type":"ContainerDied","Data":"bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115"}
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.827852 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bdb6fddf-10f2-476b-822f-130f6fa12007","Type":"ContainerDied","Data":"767ebbd1c5208f481e0a0c9a07d1e2942ae4da643c6fc17067643a26968c3ac5"}
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.827874 4922 scope.go:117] "RemoveContainer" containerID="bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115"
Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.828069 4922 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.828176 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-config-data" (OuterVolumeSpecName: "config-data") pod "bdb6fddf-10f2-476b-822f-130f6fa12007" (UID: "bdb6fddf-10f2-476b-822f-130f6fa12007"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.836350 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerStarted","Data":"ede12ae546b38989e02144d9c29b45bfe17c0490fbfcd99cc1a79c54dc349009"} Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.836625 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-central-agent" containerID="cri-o://6c511819319a5fc193d0f75ed63d9083c79d2c12407691912b456da48212acab" gracePeriod=30 Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.837046 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.837512 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="proxy-httpd" containerID="cri-o://ede12ae546b38989e02144d9c29b45bfe17c0490fbfcd99cc1a79c54dc349009" gracePeriod=30 Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.837593 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="sg-core" containerID="cri-o://019a6342d78a17e9007569b8df388ae4cd83074ceff02bfe55eb5c3a71054609" gracePeriod=30 Feb 18 
11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.837655 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-notification-agent" containerID="cri-o://d7cb0d481469976e2f0d27c45a95752ec61c6325990c025f51f551cbf42c3d51" gracePeriod=30 Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.845079 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.845114 4922 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.845125 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6wh\" (UniqueName: \"kubernetes.io/projected/bdb6fddf-10f2-476b-822f-130f6fa12007-kube-api-access-ff6wh\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.845147 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb6fddf-10f2-476b-822f-130f6fa12007-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.845159 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb6fddf-10f2-476b-822f-130f6fa12007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.867698 4922 scope.go:117] "RemoveContainer" containerID="bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115" Feb 18 11:57:59 crc kubenswrapper[4922]: E0218 11:57:59.868914 4922 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115\": container with ID starting with bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115 not found: ID does not exist" containerID="bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.868975 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115"} err="failed to get container status \"bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115\": rpc error: code = NotFound desc = could not find container \"bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115\": container with ID starting with bc2ff599d529126d7879e500e35b5132d881d5c58dbce8cd92a858ece8e6e115 not found: ID does not exist" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.870199 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.006449051 podStartE2EDuration="5.87018607s" podCreationTimestamp="2026-02-18 11:57:54 +0000 UTC" firstStartedPulling="2026-02-18 11:57:55.598328735 +0000 UTC m=+1277.326032815" lastFinishedPulling="2026-02-18 11:57:59.462065754 +0000 UTC m=+1281.189769834" observedRunningTime="2026-02-18 11:57:59.862795613 +0000 UTC m=+1281.590499693" watchObservedRunningTime="2026-02-18 11:57:59.87018607 +0000 UTC m=+1281.597890150" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.892402 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.906957 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.918808 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/watcher-decision-engine-0"] Feb 18 11:57:59 crc kubenswrapper[4922]: E0218 11:57:59.919258 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb6fddf-10f2-476b-822f-130f6fa12007" containerName="watcher-decision-engine" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.919272 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb6fddf-10f2-476b-822f-130f6fa12007" containerName="watcher-decision-engine" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.919474 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb6fddf-10f2-476b-822f-130f6fa12007" containerName="watcher-decision-engine" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.921072 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.935833 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:57:59 crc kubenswrapper[4922]: I0218 11:57:59.936680 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.058391 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxjq\" (UniqueName: \"kubernetes.io/projected/3df41ae7-b237-49e2-902c-f33e693f5db9-kube-api-access-stxjq\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.058452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " 
pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.058563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.058612 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df41ae7-b237-49e2-902c-f33e693f5db9-logs\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.058673 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.160692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df41ae7-b237-49e2-902c-f33e693f5db9-logs\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.160998 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc 
kubenswrapper[4922]: I0218 11:58:00.161285 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df41ae7-b237-49e2-902c-f33e693f5db9-logs\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.161721 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stxjq\" (UniqueName: \"kubernetes.io/projected/3df41ae7-b237-49e2-902c-f33e693f5db9-kube-api-access-stxjq\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.161754 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.161811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.165073 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.166128 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.168159 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df41ae7-b237-49e2-902c-f33e693f5db9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.181804 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxjq\" (UniqueName: \"kubernetes.io/projected/3df41ae7-b237-49e2-902c-f33e693f5db9-kube-api-access-stxjq\") pod \"watcher-decision-engine-0\" (UID: \"3df41ae7-b237-49e2-902c-f33e693f5db9\") " pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.247466 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.742865 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.846569 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3df41ae7-b237-49e2-902c-f33e693f5db9","Type":"ContainerStarted","Data":"d6a4a9c5612986b7024445846c98908756d70164a4c3727278455ddfd9a50fed"} Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.850673 4922 generic.go:334] "Generic (PLEG): container finished" podID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerID="019a6342d78a17e9007569b8df388ae4cd83074ceff02bfe55eb5c3a71054609" exitCode=2 Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.850703 4922 generic.go:334] "Generic (PLEG): container finished" podID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerID="d7cb0d481469976e2f0d27c45a95752ec61c6325990c025f51f551cbf42c3d51" exitCode=0 Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.850709 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerDied","Data":"019a6342d78a17e9007569b8df388ae4cd83074ceff02bfe55eb5c3a71054609"} Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.850752 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerDied","Data":"d7cb0d481469976e2f0d27c45a95752ec61c6325990c025f51f551cbf42c3d51"} Feb 18 11:58:00 crc kubenswrapper[4922]: I0218 11:58:00.986152 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb6fddf-10f2-476b-822f-130f6fa12007" path="/var/lib/kubelet/pods/bdb6fddf-10f2-476b-822f-130f6fa12007/volumes" Feb 18 11:58:01 crc kubenswrapper[4922]: I0218 11:58:01.862288 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3df41ae7-b237-49e2-902c-f33e693f5db9","Type":"ContainerStarted","Data":"87903c73d53186adbab4fd2f241af4eb9eed10c5c887c5c9f491c04133c21f0b"} Feb 18 11:58:01 crc kubenswrapper[4922]: I0218 11:58:01.884393 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.884354529 podStartE2EDuration="2.884354529s" podCreationTimestamp="2026-02-18 11:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:01.879586759 +0000 UTC m=+1283.607290839" watchObservedRunningTime="2026-02-18 11:58:01.884354529 +0000 UTC m=+1283.612058609" Feb 18 11:58:02 crc kubenswrapper[4922]: I0218 11:58:02.873965 4922 generic.go:334] "Generic (PLEG): container finished" podID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerID="c50b1f228a346e798dee756ffe7172e5752cdc2bb9dd07e1f2ab8318d4e33a78" exitCode=0 Feb 18 11:58:02 crc kubenswrapper[4922]: I0218 11:58:02.874050 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-979b8465b-gmztk" event={"ID":"6b1ea57e-dcf2-4e47-8650-af483b18ea8f","Type":"ContainerDied","Data":"c50b1f228a346e798dee756ffe7172e5752cdc2bb9dd07e1f2ab8318d4e33a78"} Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.409662 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.524142 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjthv\" (UniqueName: \"kubernetes.io/projected/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-kube-api-access-vjthv\") pod \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.524227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-combined-ca-bundle\") pod \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.524287 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-ovndb-tls-certs\") pod \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.524407 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-config\") pod \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.524515 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-httpd-config\") pod \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\" (UID: \"6b1ea57e-dcf2-4e47-8650-af483b18ea8f\") " Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.535814 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-kube-api-access-vjthv" (OuterVolumeSpecName: "kube-api-access-vjthv") pod "6b1ea57e-dcf2-4e47-8650-af483b18ea8f" (UID: "6b1ea57e-dcf2-4e47-8650-af483b18ea8f"). InnerVolumeSpecName "kube-api-access-vjthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.536177 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6b1ea57e-dcf2-4e47-8650-af483b18ea8f" (UID: "6b1ea57e-dcf2-4e47-8650-af483b18ea8f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.585352 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b1ea57e-dcf2-4e47-8650-af483b18ea8f" (UID: "6b1ea57e-dcf2-4e47-8650-af483b18ea8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.600476 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-config" (OuterVolumeSpecName: "config") pod "6b1ea57e-dcf2-4e47-8650-af483b18ea8f" (UID: "6b1ea57e-dcf2-4e47-8650-af483b18ea8f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.627452 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.627493 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.627507 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjthv\" (UniqueName: \"kubernetes.io/projected/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-kube-api-access-vjthv\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.627519 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.635524 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6b1ea57e-dcf2-4e47-8650-af483b18ea8f" (UID: "6b1ea57e-dcf2-4e47-8650-af483b18ea8f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.729642 4922 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b1ea57e-dcf2-4e47-8650-af483b18ea8f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.885198 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-979b8465b-gmztk" event={"ID":"6b1ea57e-dcf2-4e47-8650-af483b18ea8f","Type":"ContainerDied","Data":"d41035e2a8cf606b335f710a1d0f91690ca9a161e3cc25d246206a6e3ab38420"} Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.885261 4922 scope.go:117] "RemoveContainer" containerID="dd50276aac03478ea25adcde79bcd1337f3694428dc64ce90dcf6fe92e773c3c" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.885260 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-979b8465b-gmztk" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.918565 4922 scope.go:117] "RemoveContainer" containerID="c50b1f228a346e798dee756ffe7172e5752cdc2bb9dd07e1f2ab8318d4e33a78" Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.928535 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-979b8465b-gmztk"] Feb 18 11:58:03 crc kubenswrapper[4922]: I0218 11:58:03.936675 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-979b8465b-gmztk"] Feb 18 11:58:04 crc kubenswrapper[4922]: I0218 11:58:04.988623 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" path="/var/lib/kubelet/pods/6b1ea57e-dcf2-4e47-8650-af483b18ea8f/volumes" Feb 18 11:58:05 crc kubenswrapper[4922]: I0218 11:58:05.907230 4922 generic.go:334] "Generic (PLEG): container finished" podID="ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" containerID="38158e702c80c7a31868549296123efd8bf7bfe465aae6ab472768cf855c3a38" exitCode=0 Feb 
18 11:58:05 crc kubenswrapper[4922]: I0218 11:58:05.907325 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" event={"ID":"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5","Type":"ContainerDied","Data":"38158e702c80c7a31868549296123efd8bf7bfe465aae6ab472768cf855c3a38"} Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.256285 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.291719 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-config-data\") pod \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.291856 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-scripts\") pod \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.291892 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdr6h\" (UniqueName: \"kubernetes.io/projected/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-kube-api-access-hdr6h\") pod \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.291933 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-combined-ca-bundle\") pod \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\" (UID: \"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5\") " Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.298125 4922 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-scripts" (OuterVolumeSpecName: "scripts") pod "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" (UID: "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.298467 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-kube-api-access-hdr6h" (OuterVolumeSpecName: "kube-api-access-hdr6h") pod "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" (UID: "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5"). InnerVolumeSpecName "kube-api-access-hdr6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.326535 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" (UID: "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.326584 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-config-data" (OuterVolumeSpecName: "config-data") pod "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" (UID: "ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.393911 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.393971 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.393981 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdr6h\" (UniqueName: \"kubernetes.io/projected/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-kube-api-access-hdr6h\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.393993 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.925188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws7m8" event={"ID":"ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5","Type":"ContainerDied","Data":"19238151c6895845aeb1fcd79101b75a8cace32e850eb034480705b42b7e45a0"}
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.925238 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19238151c6895845aeb1fcd79101b75a8cace32e850eb034480705b42b7e45a0"
Feb 18 11:58:07 crc kubenswrapper[4922]: I0218 11:58:07.925238 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws7m8"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.028836 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 11:58:08 crc kubenswrapper[4922]: E0218 11:58:08.031271 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-httpd"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.031518 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-httpd"
Feb 18 11:58:08 crc kubenswrapper[4922]: E0218 11:58:08.031673 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-api"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.031968 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-api"
Feb 18 11:58:08 crc kubenswrapper[4922]: E0218 11:58:08.032100 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" containerName="nova-cell0-conductor-db-sync"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.032172 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" containerName="nova-cell0-conductor-db-sync"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.032512 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-api"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.032609 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" containerName="nova-cell0-conductor-db-sync"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.032793 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1ea57e-dcf2-4e47-8650-af483b18ea8f" containerName="neutron-httpd"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.033746 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.035745 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.037578 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m8flj"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.040468 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.107128 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a95479a-1834-4e95-b18a-c0bcef05f7ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.107191 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9nd\" (UniqueName: \"kubernetes.io/projected/4a95479a-1834-4e95-b18a-c0bcef05f7ed-kube-api-access-vp9nd\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.107443 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a95479a-1834-4e95-b18a-c0bcef05f7ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.209281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a95479a-1834-4e95-b18a-c0bcef05f7ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.209669 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9nd\" (UniqueName: \"kubernetes.io/projected/4a95479a-1834-4e95-b18a-c0bcef05f7ed-kube-api-access-vp9nd\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.209859 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a95479a-1834-4e95-b18a-c0bcef05f7ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.213249 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a95479a-1834-4e95-b18a-c0bcef05f7ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.219193 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a95479a-1834-4e95-b18a-c0bcef05f7ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.229613 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9nd\" (UniqueName: \"kubernetes.io/projected/4a95479a-1834-4e95-b18a-c0bcef05f7ed-kube-api-access-vp9nd\") pod \"nova-cell0-conductor-0\" (UID: \"4a95479a-1834-4e95-b18a-c0bcef05f7ed\") " pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.354079 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.796102 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.944728 4922 generic.go:334] "Generic (PLEG): container finished" podID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerID="6c511819319a5fc193d0f75ed63d9083c79d2c12407691912b456da48212acab" exitCode=0
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.944887 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerDied","Data":"6c511819319a5fc193d0f75ed63d9083c79d2c12407691912b456da48212acab"}
Feb 18 11:58:08 crc kubenswrapper[4922]: I0218 11:58:08.962112 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a95479a-1834-4e95-b18a-c0bcef05f7ed","Type":"ContainerStarted","Data":"f42d55be5c9e0a17431f7ff37043f8713b027dcd2e1d8947123ee2cafd90afd7"}
Feb 18 11:58:09 crc kubenswrapper[4922]: I0218 11:58:09.974302 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a95479a-1834-4e95-b18a-c0bcef05f7ed","Type":"ContainerStarted","Data":"02ac8807bfca9d0dfa78345a198171fd9c85f24ad621752cf9fc96c02e8473c1"}
Feb 18 11:58:09 crc kubenswrapper[4922]: I0218 11:58:09.974606 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:09 crc kubenswrapper[4922]: I0218 11:58:09.999877 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9998415349999998 podStartE2EDuration="1.999841535s" podCreationTimestamp="2026-02-18 11:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:09.994388978 +0000 UTC m=+1291.722093078" watchObservedRunningTime="2026-02-18 11:58:09.999841535 +0000 UTC m=+1291.727545615"
Feb 18 11:58:10 crc kubenswrapper[4922]: I0218 11:58:10.248873 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 18 11:58:10 crc kubenswrapper[4922]: I0218 11:58:10.274295 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Feb 18 11:58:10 crc kubenswrapper[4922]: I0218 11:58:10.984711 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Feb 18 11:58:11 crc kubenswrapper[4922]: I0218 11:58:11.008860 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.380309 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.803128 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gts9l"]
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.804706 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.806824 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.808208 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.814587 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gts9l"]
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.931924 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-scripts\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.932026 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2rwd\" (UniqueName: \"kubernetes.io/projected/95fc0adb-b8ae-4fd6-88eb-3b6357173103-kube-api-access-r2rwd\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.932108 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-config-data\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.932249 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.955975 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.958209 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.964823 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 11:58:13 crc kubenswrapper[4922]: I0218 11:58:13.995870 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035617 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035671 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2856a778-a8b2-4740-8d2a-4a6f64619bc2-logs\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035694 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-scripts\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035759 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2rwd\" (UniqueName: \"kubernetes.io/projected/95fc0adb-b8ae-4fd6-88eb-3b6357173103-kube-api-access-r2rwd\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035796 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-config-data\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035833 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-config-data\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035874 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.035913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqjqm\" (UniqueName: \"kubernetes.io/projected/2856a778-a8b2-4740-8d2a-4a6f64619bc2-kube-api-access-bqjqm\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.054959 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.056470 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.058195 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-config-data\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.060905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-scripts\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.061566 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.076187 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.078256 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.105501 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2rwd\" (UniqueName: \"kubernetes.io/projected/95fc0adb-b8ae-4fd6-88eb-3b6357173103-kube-api-access-r2rwd\") pod \"nova-cell0-cell-mapping-gts9l\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") " pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.124220 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.137792 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.137856 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqjqm\" (UniqueName: \"kubernetes.io/projected/2856a778-a8b2-4740-8d2a-4a6f64619bc2-kube-api-access-bqjqm\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.137911 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2856a778-a8b2-4740-8d2a-4a6f64619bc2-logs\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.137942 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5hb\" (UniqueName: \"kubernetes.io/projected/3ce92cda-4459-4a73-8fc2-84bbb56eccce-kube-api-access-mv5hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.137999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.138022 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-config-data\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.138061 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.139202 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2856a778-a8b2-4740-8d2a-4a6f64619bc2-logs\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.144292 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-config-data\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.153437 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.174557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqjqm\" (UniqueName: \"kubernetes.io/projected/2856a778-a8b2-4740-8d2a-4a6f64619bc2-kube-api-access-bqjqm\") pod \"nova-api-0\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.184234 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.186050 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.194159 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.213200 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245118 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66vrt\" (UniqueName: \"kubernetes.io/projected/eea42093-2f99-433e-8cde-fe075d89d91f-kube-api-access-66vrt\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245164 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea42093-2f99-433e-8cde-fe075d89d91f-logs\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245267 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-config-data\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245304 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5hb\" (UniqueName: \"kubernetes.io/projected/3ce92cda-4459-4a73-8fc2-84bbb56eccce-kube-api-access-mv5hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245339 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.245388 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.249984 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.268586 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5hb\" (UniqueName: \"kubernetes.io/projected/3ce92cda-4459-4a73-8fc2-84bbb56eccce-kube-api-access-mv5hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.269996 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.281518 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.282671 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.282914 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.286704 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.309342 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.340790 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-pc7hm"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.343541 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347563 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347644 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvg5\" (UniqueName: \"kubernetes.io/projected/4e0d2e17-4045-420d-817b-41a1fc66c425-kube-api-access-qvvg5\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347720 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66vrt\" (UniqueName: \"kubernetes.io/projected/eea42093-2f99-433e-8cde-fe075d89d91f-kube-api-access-66vrt\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347795 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347850 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea42093-2f99-433e-8cde-fe075d89d91f-logs\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347876 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-config-data\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.347910 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-config-data\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.350892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea42093-2f99-433e-8cde-fe075d89d91f-logs\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.353112 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.361604 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-config-data\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.365175 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-pc7hm"]
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.379178 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.380209 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66vrt\" (UniqueName: \"kubernetes.io/projected/eea42093-2f99-433e-8cde-fe075d89d91f-kube-api-access-66vrt\") pod \"nova-metadata-0\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.408589 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450147 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29tzw\" (UniqueName: \"kubernetes.io/projected/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-kube-api-access-29tzw\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450206 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450251 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450289 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvvg5\" (UniqueName: \"kubernetes.io/projected/4e0d2e17-4045-420d-817b-41a1fc66c425-kube-api-access-qvvg5\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450372 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-svc\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450432 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-config\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.450555 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-config-data\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.455092 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.457832 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-config-data\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.469965 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvvg5\" (UniqueName: \"kubernetes.io/projected/4e0d2e17-4045-420d-817b-41a1fc66c425-kube-api-access-qvvg5\") pod \"nova-scheduler-0\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " pod="openstack/nova-scheduler-0"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.554082 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29tzw\" (UniqueName: \"kubernetes.io/projected/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-kube-api-access-29tzw\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.554178 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID:
\"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.554248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.554389 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-svc\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.554428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.554470 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-config\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.556032 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-config\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc 
kubenswrapper[4922]: I0218 11:58:14.556032 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.556804 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-svc\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.557863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.557916 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.588122 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29tzw\" (UniqueName: \"kubernetes.io/projected/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-kube-api-access-29tzw\") pod \"dnsmasq-dns-bccf8f775-pc7hm\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.727875 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.745762 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:14 crc kubenswrapper[4922]: I0218 11:58:14.811517 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gts9l"] Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.041202 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.081842 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gts9l" event={"ID":"95fc0adb-b8ae-4fd6-88eb-3b6357173103","Type":"ContainerStarted","Data":"3dd9e8b282e0169c21e93008600de3cbb1dc520cc5c66cbb90c2934ce89d2770"} Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.085476 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqpbh"] Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.093435 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.101739 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqpbh"] Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.104395 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.107960 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.108787 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.120266 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.169926 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.170157 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-config-data\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.170462 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-scripts\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.170498 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8zsk\" (UniqueName: \"kubernetes.io/projected/aacb8ffe-ff30-4292-b253-1e12d07f499b-kube-api-access-q8zsk\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.278495 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-scripts\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.278559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8zsk\" (UniqueName: \"kubernetes.io/projected/aacb8ffe-ff30-4292-b253-1e12d07f499b-kube-api-access-q8zsk\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.278730 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.279052 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-config-data\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.285563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-config-data\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.286960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-scripts\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.292050 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.300875 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8zsk\" (UniqueName: \"kubernetes.io/projected/aacb8ffe-ff30-4292-b253-1e12d07f499b-kube-api-access-q8zsk\") pod \"nova-cell1-conductor-db-sync-zqpbh\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") " pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.387162 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:58:15 crc 
kubenswrapper[4922]: I0218 11:58:15.491560 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" Feb 18 11:58:15 crc kubenswrapper[4922]: I0218 11:58:15.531467 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-pc7hm"] Feb 18 11:58:15 crc kubenswrapper[4922]: W0218 11:58:15.542106 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fe07f11_0f10_4aa7_ab94_51d42b7a6367.slice/crio-0199a345c591cc6803424fbda06bc8be81c6f2649bc4692acfd3ea6b8d197abc WatchSource:0}: Error finding container 0199a345c591cc6803424fbda06bc8be81c6f2649bc4692acfd3ea6b8d197abc: Status 404 returned error can't find the container with id 0199a345c591cc6803424fbda06bc8be81c6f2649bc4692acfd3ea6b8d197abc Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.022228 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqpbh"] Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.106895 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" event={"ID":"aacb8ffe-ff30-4292-b253-1e12d07f499b","Type":"ContainerStarted","Data":"1ed9fd2d7f07e8fb94873887c1633bde383c19238f884a2aec0494c3844e1788"} Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.108253 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eea42093-2f99-433e-8cde-fe075d89d91f","Type":"ContainerStarted","Data":"6f453239385971f9189e759542805bb61c809c6dd720600811af9a9f4a7ac835"} Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.111408 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3ce92cda-4459-4a73-8fc2-84bbb56eccce","Type":"ContainerStarted","Data":"92d7d00c3ed4a00a1aeac19f0bbbe19ab3df77be12277cadf25b39bf998465e7"} Feb 18 11:58:16 crc 
kubenswrapper[4922]: I0218 11:58:16.114220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gts9l" event={"ID":"95fc0adb-b8ae-4fd6-88eb-3b6357173103","Type":"ContainerStarted","Data":"3dd5bc6bbbea448a7d9229f94c3182f66fafad86532a2426f6a51eaa5b649205"} Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.135342 4922 generic.go:334] "Generic (PLEG): container finished" podID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerID="8aa54b45b2152668d79b56f9c12b91df1925011ab0dbd7a2601a1ffa9f2d27a9" exitCode=0 Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.135458 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" event={"ID":"5fe07f11-0f10-4aa7-ab94-51d42b7a6367","Type":"ContainerDied","Data":"8aa54b45b2152668d79b56f9c12b91df1925011ab0dbd7a2601a1ffa9f2d27a9"} Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.135484 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" event={"ID":"5fe07f11-0f10-4aa7-ab94-51d42b7a6367","Type":"ContainerStarted","Data":"0199a345c591cc6803424fbda06bc8be81c6f2649bc4692acfd3ea6b8d197abc"} Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.144703 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e0d2e17-4045-420d-817b-41a1fc66c425","Type":"ContainerStarted","Data":"5c4ed2cdb2b752aa88aa0c848f0558afd5579a29c6af7de94b0afbe9de2ec4ec"} Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.148672 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gts9l" podStartSLOduration=3.148649972 podStartE2EDuration="3.148649972s" podCreationTimestamp="2026-02-18 11:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:16.138924446 +0000 UTC m=+1297.866628546" 
watchObservedRunningTime="2026-02-18 11:58:16.148649972 +0000 UTC m=+1297.876354052" Feb 18 11:58:16 crc kubenswrapper[4922]: I0218 11:58:16.164578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2856a778-a8b2-4740-8d2a-4a6f64619bc2","Type":"ContainerStarted","Data":"0bc208ef650f9f609657312f6d8f2198c5a12a9f657dff0516610817ba9c8516"} Feb 18 11:58:17 crc kubenswrapper[4922]: I0218 11:58:17.181039 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" event={"ID":"aacb8ffe-ff30-4292-b253-1e12d07f499b","Type":"ContainerStarted","Data":"ee60fdb1d51bfbe6bd8d0e50a4e4f1a8691171bcc2ba0c807ba59a9d88e2dc0d"} Feb 18 11:58:17 crc kubenswrapper[4922]: I0218 11:58:17.195525 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" podStartSLOduration=3.195506452 podStartE2EDuration="3.195506452s" podCreationTimestamp="2026-02-18 11:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:17.194809345 +0000 UTC m=+1298.922513425" watchObservedRunningTime="2026-02-18 11:58:17.195506452 +0000 UTC m=+1298.923210532" Feb 18 11:58:17 crc kubenswrapper[4922]: I0218 11:58:17.684564 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:17 crc kubenswrapper[4922]: I0218 11:58:17.699065 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.234338 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eea42093-2f99-433e-8cde-fe075d89d91f","Type":"ContainerStarted","Data":"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.235047 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"eea42093-2f99-433e-8cde-fe075d89d91f","Type":"ContainerStarted","Data":"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.234512 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-metadata" containerID="cri-o://b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25" gracePeriod=30 Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.234464 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-log" containerID="cri-o://353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d" gracePeriod=30 Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.237571 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3ce92cda-4459-4a73-8fc2-84bbb56eccce","Type":"ContainerStarted","Data":"c14ea122843eb522430115cdefefb0c3b7d6b58a0e5c203b5fb2fca56b1e56ba"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.238683 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3ce92cda-4459-4a73-8fc2-84bbb56eccce" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c14ea122843eb522430115cdefefb0c3b7d6b58a0e5c203b5fb2fca56b1e56ba" gracePeriod=30 Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.244666 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" event={"ID":"5fe07f11-0f10-4aa7-ab94-51d42b7a6367","Type":"ContainerStarted","Data":"42671006b982c23fe661235bf5a74dcd5e79ee9d03166f4a0fe4596b08069112"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.249328 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e0d2e17-4045-420d-817b-41a1fc66c425","Type":"ContainerStarted","Data":"c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.252441 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2856a778-a8b2-4740-8d2a-4a6f64619bc2","Type":"ContainerStarted","Data":"783aeec10a7933650343fa4b55375b721d65807531d8e7dbf7c2823b6b487dd6"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.252493 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2856a778-a8b2-4740-8d2a-4a6f64619bc2","Type":"ContainerStarted","Data":"deb34b70b7dbc61de6d4d92ee5f97c64b68483ab7a9de52c357c2663ff82c337"} Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.264604 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.384826183 podStartE2EDuration="7.264584092s" podCreationTimestamp="2026-02-18 11:58:14 +0000 UTC" firstStartedPulling="2026-02-18 11:58:15.090413335 +0000 UTC m=+1296.818117405" lastFinishedPulling="2026-02-18 11:58:19.970171234 +0000 UTC m=+1301.697875314" observedRunningTime="2026-02-18 11:58:21.255374379 +0000 UTC m=+1302.983078479" watchObservedRunningTime="2026-02-18 11:58:21.264584092 +0000 UTC m=+1302.992288172" Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.288687 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.361677808 podStartE2EDuration="8.288664131s" podCreationTimestamp="2026-02-18 11:58:13 +0000 UTC" firstStartedPulling="2026-02-18 11:58:15.0415503 +0000 UTC m=+1296.769254380" lastFinishedPulling="2026-02-18 11:58:19.968536623 +0000 UTC m=+1301.696240703" observedRunningTime="2026-02-18 11:58:21.279109589 +0000 UTC m=+1303.006813679" watchObservedRunningTime="2026-02-18 11:58:21.288664131 +0000 
UTC m=+1303.016368211" Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.304595 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.729398631 podStartE2EDuration="7.304578043s" podCreationTimestamp="2026-02-18 11:58:14 +0000 UTC" firstStartedPulling="2026-02-18 11:58:15.39370412 +0000 UTC m=+1297.121408200" lastFinishedPulling="2026-02-18 11:58:19.968883522 +0000 UTC m=+1301.696587612" observedRunningTime="2026-02-18 11:58:21.293699348 +0000 UTC m=+1303.021403438" watchObservedRunningTime="2026-02-18 11:58:21.304578043 +0000 UTC m=+1303.032282123" Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.320062 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.486165194 podStartE2EDuration="7.320041644s" podCreationTimestamp="2026-02-18 11:58:14 +0000 UTC" firstStartedPulling="2026-02-18 11:58:15.134734015 +0000 UTC m=+1296.862438095" lastFinishedPulling="2026-02-18 11:58:19.968610465 +0000 UTC m=+1301.696314545" observedRunningTime="2026-02-18 11:58:21.311427066 +0000 UTC m=+1303.039131146" watchObservedRunningTime="2026-02-18 11:58:21.320041644 +0000 UTC m=+1303.047745724" Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.363525 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" podStartSLOduration=7.363503032 podStartE2EDuration="7.363503032s" podCreationTimestamp="2026-02-18 11:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:21.345964749 +0000 UTC m=+1303.073668829" watchObservedRunningTime="2026-02-18 11:58:21.363503032 +0000 UTC m=+1303.091207122" Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.842766 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.935625 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-combined-ca-bundle\") pod \"eea42093-2f99-433e-8cde-fe075d89d91f\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.935695 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66vrt\" (UniqueName: \"kubernetes.io/projected/eea42093-2f99-433e-8cde-fe075d89d91f-kube-api-access-66vrt\") pod \"eea42093-2f99-433e-8cde-fe075d89d91f\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.935776 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-config-data\") pod \"eea42093-2f99-433e-8cde-fe075d89d91f\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.935823 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea42093-2f99-433e-8cde-fe075d89d91f-logs\") pod \"eea42093-2f99-433e-8cde-fe075d89d91f\" (UID: \"eea42093-2f99-433e-8cde-fe075d89d91f\") " Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.936803 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea42093-2f99-433e-8cde-fe075d89d91f-logs" (OuterVolumeSpecName: "logs") pod "eea42093-2f99-433e-8cde-fe075d89d91f" (UID: "eea42093-2f99-433e-8cde-fe075d89d91f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.941736 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea42093-2f99-433e-8cde-fe075d89d91f-kube-api-access-66vrt" (OuterVolumeSpecName: "kube-api-access-66vrt") pod "eea42093-2f99-433e-8cde-fe075d89d91f" (UID: "eea42093-2f99-433e-8cde-fe075d89d91f"). InnerVolumeSpecName "kube-api-access-66vrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.973147 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-config-data" (OuterVolumeSpecName: "config-data") pod "eea42093-2f99-433e-8cde-fe075d89d91f" (UID: "eea42093-2f99-433e-8cde-fe075d89d91f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:21 crc kubenswrapper[4922]: I0218 11:58:21.976969 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eea42093-2f99-433e-8cde-fe075d89d91f" (UID: "eea42093-2f99-433e-8cde-fe075d89d91f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.038009 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.038048 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66vrt\" (UniqueName: \"kubernetes.io/projected/eea42093-2f99-433e-8cde-fe075d89d91f-kube-api-access-66vrt\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.038065 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea42093-2f99-433e-8cde-fe075d89d91f-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.038079 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea42093-2f99-433e-8cde-fe075d89d91f-logs\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.264963 4922 generic.go:334] "Generic (PLEG): container finished" podID="eea42093-2f99-433e-8cde-fe075d89d91f" containerID="b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25" exitCode=0
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.265294 4922 generic.go:334] "Generic (PLEG): container finished" podID="eea42093-2f99-433e-8cde-fe075d89d91f" containerID="353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d" exitCode=143
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.265211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eea42093-2f99-433e-8cde-fe075d89d91f","Type":"ContainerDied","Data":"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"}
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.265711 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eea42093-2f99-433e-8cde-fe075d89d91f","Type":"ContainerDied","Data":"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"}
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.265733 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eea42093-2f99-433e-8cde-fe075d89d91f","Type":"ContainerDied","Data":"6f453239385971f9189e759542805bb61c809c6dd720600811af9a9f4a7ac835"}
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.265752 4922 scope.go:117] "RemoveContainer" containerID="b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.265176 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.267412 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.307887 4922 scope.go:117] "RemoveContainer" containerID="353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.333517 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.333841 4922 scope.go:117] "RemoveContainer" containerID="b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"
Feb 18 11:58:22 crc kubenswrapper[4922]: E0218 11:58:22.334583 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25\": container with ID starting with b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25 not found: ID does not exist" containerID="b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.334620 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"} err="failed to get container status \"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25\": rpc error: code = NotFound desc = could not find container \"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25\": container with ID starting with b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25 not found: ID does not exist"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.334646 4922 scope.go:117] "RemoveContainer" containerID="353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"
Feb 18 11:58:22 crc kubenswrapper[4922]: E0218 11:58:22.335046 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d\": container with ID starting with 353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d not found: ID does not exist" containerID="353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.335063 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"} err="failed to get container status \"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d\": rpc error: code = NotFound desc = could not find container \"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d\": container with ID starting with 353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d not found: ID does not exist"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.335084 4922 scope.go:117] "RemoveContainer" containerID="b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.335294 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25"} err="failed to get container status \"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25\": rpc error: code = NotFound desc = could not find container \"b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25\": container with ID starting with b8da702e013fe942a755036a03637ef1c9322a5d134a3e1f2777cc744e2c7d25 not found: ID does not exist"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.335313 4922 scope.go:117] "RemoveContainer" containerID="353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.335543 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d"} err="failed to get container status \"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d\": rpc error: code = NotFound desc = could not find container \"353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d\": container with ID starting with 353ea64b8a388da79d11ac959cbc398ef7009e7be3e2209ab8d4d8c02038874d not found: ID does not exist"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.338753 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.348004 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:22 crc kubenswrapper[4922]: E0218 11:58:22.352014 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-log"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.352039 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-log"
Feb 18 11:58:22 crc kubenswrapper[4922]: E0218 11:58:22.352058 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-metadata"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.352064 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-metadata"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.352287 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-metadata"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.352311 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" containerName="nova-metadata-log"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.353328 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.356172 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.356391 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.392388 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.444077 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cda998c8-9655-49e8-ad74-689371f71535-logs\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.444227 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.444277 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-config-data\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.444315 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx867\" (UniqueName: \"kubernetes.io/projected/cda998c8-9655-49e8-ad74-689371f71535-kube-api-access-mx867\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.444350 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.545899 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.545960 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-config-data\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.545993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx867\" (UniqueName: \"kubernetes.io/projected/cda998c8-9655-49e8-ad74-689371f71535-kube-api-access-mx867\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.546023 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.546090 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cda998c8-9655-49e8-ad74-689371f71535-logs\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.546588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cda998c8-9655-49e8-ad74-689371f71535-logs\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.559604 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.560198 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-config-data\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.561081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.562932 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx867\" (UniqueName: \"kubernetes.io/projected/cda998c8-9655-49e8-ad74-689371f71535-kube-api-access-mx867\") pod \"nova-metadata-0\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.674738 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:58:22 crc kubenswrapper[4922]: I0218 11:58:22.985173 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea42093-2f99-433e-8cde-fe075d89d91f" path="/var/lib/kubelet/pods/eea42093-2f99-433e-8cde-fe075d89d91f/volumes"
Feb 18 11:58:23 crc kubenswrapper[4922]: I0218 11:58:23.131263 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:23 crc kubenswrapper[4922]: I0218 11:58:23.284371 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cda998c8-9655-49e8-ad74-689371f71535","Type":"ContainerStarted","Data":"356cb0c5496ed1db46c504b66fd0a7df5bfe8b80d7e99efd7e71f14288b55ff9"}
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.284497 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.284990 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.300159 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cda998c8-9655-49e8-ad74-689371f71535","Type":"ContainerStarted","Data":"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a"}
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.300239 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cda998c8-9655-49e8-ad74-689371f71535","Type":"ContainerStarted","Data":"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e"}
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.305879 4922 generic.go:334] "Generic (PLEG): container finished" podID="95fc0adb-b8ae-4fd6-88eb-3b6357173103" containerID="3dd5bc6bbbea448a7d9229f94c3182f66fafad86532a2426f6a51eaa5b649205" exitCode=0
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.305993 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gts9l" event={"ID":"95fc0adb-b8ae-4fd6-88eb-3b6357173103","Type":"ContainerDied","Data":"3dd5bc6bbbea448a7d9229f94c3182f66fafad86532a2426f6a51eaa5b649205"}
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.309730 4922 generic.go:334] "Generic (PLEG): container finished" podID="aacb8ffe-ff30-4292-b253-1e12d07f499b" containerID="ee60fdb1d51bfbe6bd8d0e50a4e4f1a8691171bcc2ba0c807ba59a9d88e2dc0d" exitCode=0
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.309797 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" event={"ID":"aacb8ffe-ff30-4292-b253-1e12d07f499b","Type":"ContainerDied","Data":"ee60fdb1d51bfbe6bd8d0e50a4e4f1a8691171bcc2ba0c807ba59a9d88e2dc0d"}
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.322111 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.322082643 podStartE2EDuration="2.322082643s" podCreationTimestamp="2026-02-18 11:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:24.321657922 +0000 UTC m=+1306.049362042" watchObservedRunningTime="2026-02-18 11:58:24.322082643 +0000 UTC m=+1306.049786793"
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.380595 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.728523 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.728595 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 18 11:58:24 crc kubenswrapper[4922]: I0218 11:58:24.766943 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.143508 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.350647 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.366509 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.366515 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.849230 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqpbh"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.855011 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.912970 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-scripts\") pod \"aacb8ffe-ff30-4292-b253-1e12d07f499b\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") "
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.913139 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-config-data\") pod \"aacb8ffe-ff30-4292-b253-1e12d07f499b\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") "
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.913458 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-combined-ca-bundle\") pod \"aacb8ffe-ff30-4292-b253-1e12d07f499b\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") "
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.913585 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8zsk\" (UniqueName: \"kubernetes.io/projected/aacb8ffe-ff30-4292-b253-1e12d07f499b-kube-api-access-q8zsk\") pod \"aacb8ffe-ff30-4292-b253-1e12d07f499b\" (UID: \"aacb8ffe-ff30-4292-b253-1e12d07f499b\") "
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.941942 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-scripts" (OuterVolumeSpecName: "scripts") pod "aacb8ffe-ff30-4292-b253-1e12d07f499b" (UID: "aacb8ffe-ff30-4292-b253-1e12d07f499b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.950617 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aacb8ffe-ff30-4292-b253-1e12d07f499b-kube-api-access-q8zsk" (OuterVolumeSpecName: "kube-api-access-q8zsk") pod "aacb8ffe-ff30-4292-b253-1e12d07f499b" (UID: "aacb8ffe-ff30-4292-b253-1e12d07f499b"). InnerVolumeSpecName "kube-api-access-q8zsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:25 crc kubenswrapper[4922]: I0218 11:58:25.970137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aacb8ffe-ff30-4292-b253-1e12d07f499b" (UID: "aacb8ffe-ff30-4292-b253-1e12d07f499b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.006983 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-config-data" (OuterVolumeSpecName: "config-data") pod "aacb8ffe-ff30-4292-b253-1e12d07f499b" (UID: "aacb8ffe-ff30-4292-b253-1e12d07f499b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.021681 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-scripts\") pod \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") "
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.021810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-combined-ca-bundle\") pod \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") "
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.021839 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-config-data\") pod \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") "
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.021869 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2rwd\" (UniqueName: \"kubernetes.io/projected/95fc0adb-b8ae-4fd6-88eb-3b6357173103-kube-api-access-r2rwd\") pod \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\" (UID: \"95fc0adb-b8ae-4fd6-88eb-3b6357173103\") "
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.022449 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.022471 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.022483 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacb8ffe-ff30-4292-b253-1e12d07f499b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.022497 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8zsk\" (UniqueName: \"kubernetes.io/projected/aacb8ffe-ff30-4292-b253-1e12d07f499b-kube-api-access-q8zsk\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.027339 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-scripts" (OuterVolumeSpecName: "scripts") pod "95fc0adb-b8ae-4fd6-88eb-3b6357173103" (UID: "95fc0adb-b8ae-4fd6-88eb-3b6357173103"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.028959 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fc0adb-b8ae-4fd6-88eb-3b6357173103-kube-api-access-r2rwd" (OuterVolumeSpecName: "kube-api-access-r2rwd") pod "95fc0adb-b8ae-4fd6-88eb-3b6357173103" (UID: "95fc0adb-b8ae-4fd6-88eb-3b6357173103"). InnerVolumeSpecName "kube-api-access-r2rwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.053438 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95fc0adb-b8ae-4fd6-88eb-3b6357173103" (UID: "95fc0adb-b8ae-4fd6-88eb-3b6357173103"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.053856 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-config-data" (OuterVolumeSpecName: "config-data") pod "95fc0adb-b8ae-4fd6-88eb-3b6357173103" (UID: "95fc0adb-b8ae-4fd6-88eb-3b6357173103"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.124568 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2rwd\" (UniqueName: \"kubernetes.io/projected/95fc0adb-b8ae-4fd6-88eb-3b6357173103-kube-api-access-r2rwd\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.124614 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.124630 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.124643 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc0adb-b8ae-4fd6-88eb-3b6357173103-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.327097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gts9l" event={"ID":"95fc0adb-b8ae-4fd6-88eb-3b6357173103","Type":"ContainerDied","Data":"3dd9e8b282e0169c21e93008600de3cbb1dc520cc5c66cbb90c2934ce89d2770"}
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.327143 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd9e8b282e0169c21e93008600de3cbb1dc520cc5c66cbb90c2934ce89d2770"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.327219 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gts9l"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.329967 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zqpbh"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.331428 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zqpbh" event={"ID":"aacb8ffe-ff30-4292-b253-1e12d07f499b","Type":"ContainerDied","Data":"1ed9fd2d7f07e8fb94873887c1633bde383c19238f884a2aec0494c3844e1788"}
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.331479 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ed9fd2d7f07e8fb94873887c1633bde383c19238f884a2aec0494c3844e1788"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.441458 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 11:58:26 crc kubenswrapper[4922]: E0218 11:58:26.443210 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aacb8ffe-ff30-4292-b253-1e12d07f499b" containerName="nova-cell1-conductor-db-sync"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.443303 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="aacb8ffe-ff30-4292-b253-1e12d07f499b" containerName="nova-cell1-conductor-db-sync"
Feb 18 11:58:26 crc kubenswrapper[4922]: E0218 11:58:26.443406 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fc0adb-b8ae-4fd6-88eb-3b6357173103" containerName="nova-manage"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.443462 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fc0adb-b8ae-4fd6-88eb-3b6357173103" containerName="nova-manage"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.443689 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="aacb8ffe-ff30-4292-b253-1e12d07f499b" containerName="nova-cell1-conductor-db-sync"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.443768 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fc0adb-b8ae-4fd6-88eb-3b6357173103" containerName="nova-manage"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.444460 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.452918 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.471328 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.533552 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rbd2\" (UniqueName: \"kubernetes.io/projected/31ef9a9b-fedd-4afd-8582-19ef097c98a2-kube-api-access-9rbd2\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.533901 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ef9a9b-fedd-4afd-8582-19ef097c98a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.534168 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ef9a9b-fedd-4afd-8582-19ef097c98a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.549931 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.550180 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-log" containerID="cri-o://deb34b70b7dbc61de6d4d92ee5f97c64b68483ab7a9de52c357c2663ff82c337" gracePeriod=30
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.550616 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-api" containerID="cri-o://783aeec10a7933650343fa4b55375b721d65807531d8e7dbf7c2823b6b487dd6" gracePeriod=30
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.563098 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.577656 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.577941 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-log" containerID="cri-o://c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e" gracePeriod=30
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.578005 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-metadata" containerID="cri-o://9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a" gracePeriod=30
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.636534 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ef9a9b-fedd-4afd-8582-19ef097c98a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.636859 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rbd2\" (UniqueName: \"kubernetes.io/projected/31ef9a9b-fedd-4afd-8582-19ef097c98a2-kube-api-access-9rbd2\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.636968 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ef9a9b-fedd-4afd-8582-19ef097c98a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.641020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ef9a9b-fedd-4afd-8582-19ef097c98a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.641270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ef9a9b-fedd-4afd-8582-19ef097c98a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.657408 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rbd2\" (UniqueName: \"kubernetes.io/projected/31ef9a9b-fedd-4afd-8582-19ef097c98a2-kube-api-access-9rbd2\") pod \"nova-cell1-conductor-0\" (UID: \"31ef9a9b-fedd-4afd-8582-19ef097c98a2\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:26 crc kubenswrapper[4922]: I0218 11:58:26.761796 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.314194 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.345300 4922 generic.go:334] "Generic (PLEG): container finished" podID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerID="deb34b70b7dbc61de6d4d92ee5f97c64b68483ab7a9de52c357c2663ff82c337" exitCode=143
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.345403 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2856a778-a8b2-4740-8d2a-4a6f64619bc2","Type":"ContainerDied","Data":"deb34b70b7dbc61de6d4d92ee5f97c64b68483ab7a9de52c357c2663ff82c337"}
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348415 4922 generic.go:334] "Generic (PLEG): container finished" podID="cda998c8-9655-49e8-ad74-689371f71535" containerID="9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a" exitCode=0
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348477 4922 generic.go:334] "Generic (PLEG): container finished" podID="cda998c8-9655-49e8-ad74-689371f71535" containerID="c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e" exitCode=143
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348558 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cda998c8-9655-49e8-ad74-689371f71535","Type":"ContainerDied","Data":"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a"}
Feb 18 11:58:27 crc kubenswrapper[4922]: I0218
11:58:27.348595 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cda998c8-9655-49e8-ad74-689371f71535","Type":"ContainerDied","Data":"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e"} Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348613 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cda998c8-9655-49e8-ad74-689371f71535","Type":"ContainerDied","Data":"356cb0c5496ed1db46c504b66fd0a7df5bfe8b80d7e99efd7e71f14288b55ff9"} Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348629 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348648 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4e0d2e17-4045-420d-817b-41a1fc66c425" containerName="nova-scheduler-scheduler" containerID="cri-o://c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" gracePeriod=30 Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.348637 4922 scope.go:117] "RemoveContainer" containerID="9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.363619 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.387571 4922 scope.go:117] "RemoveContainer" containerID="c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.423417 4922 scope.go:117] "RemoveContainer" containerID="9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a" Feb 18 11:58:27 crc kubenswrapper[4922]: E0218 11:58:27.423960 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a\": container with ID starting with 9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a not found: ID does not exist" containerID="9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.423993 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a"} err="failed to get container status \"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a\": rpc error: code = NotFound desc = could not find container \"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a\": container with ID starting with 9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a not found: ID does not exist" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.424016 4922 scope.go:117] "RemoveContainer" containerID="c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e" Feb 18 11:58:27 crc kubenswrapper[4922]: E0218 11:58:27.424305 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e\": container with ID starting with c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e not found: ID does not exist" containerID="c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.424325 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e"} err="failed to get container status \"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e\": rpc error: code = NotFound desc = could not find container \"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e\": container with ID 
starting with c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e not found: ID does not exist" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.424338 4922 scope.go:117] "RemoveContainer" containerID="9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.424588 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a"} err="failed to get container status \"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a\": rpc error: code = NotFound desc = could not find container \"9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a\": container with ID starting with 9dce9da90c9fbb3d596e5a3d01c25c49fc5f8665b585b44d2234a48680cd799a not found: ID does not exist" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.424618 4922 scope.go:117] "RemoveContainer" containerID="c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.424947 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e"} err="failed to get container status \"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e\": rpc error: code = NotFound desc = could not find container \"c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e\": container with ID starting with c33e043e4f1e87b96e6f147a1fcb7afe16fcf9841b5e0ebbe368f8d27947cc5e not found: ID does not exist" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.452904 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-nova-metadata-tls-certs\") pod \"cda998c8-9655-49e8-ad74-689371f71535\" (UID: 
\"cda998c8-9655-49e8-ad74-689371f71535\") " Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.453077 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cda998c8-9655-49e8-ad74-689371f71535-logs\") pod \"cda998c8-9655-49e8-ad74-689371f71535\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.453135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx867\" (UniqueName: \"kubernetes.io/projected/cda998c8-9655-49e8-ad74-689371f71535-kube-api-access-mx867\") pod \"cda998c8-9655-49e8-ad74-689371f71535\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.453246 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-combined-ca-bundle\") pod \"cda998c8-9655-49e8-ad74-689371f71535\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.453321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-config-data\") pod \"cda998c8-9655-49e8-ad74-689371f71535\" (UID: \"cda998c8-9655-49e8-ad74-689371f71535\") " Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.453562 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda998c8-9655-49e8-ad74-689371f71535-logs" (OuterVolumeSpecName: "logs") pod "cda998c8-9655-49e8-ad74-689371f71535" (UID: "cda998c8-9655-49e8-ad74-689371f71535"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.453853 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cda998c8-9655-49e8-ad74-689371f71535-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.460765 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda998c8-9655-49e8-ad74-689371f71535-kube-api-access-mx867" (OuterVolumeSpecName: "kube-api-access-mx867") pod "cda998c8-9655-49e8-ad74-689371f71535" (UID: "cda998c8-9655-49e8-ad74-689371f71535"). InnerVolumeSpecName "kube-api-access-mx867". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.489567 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-config-data" (OuterVolumeSpecName: "config-data") pod "cda998c8-9655-49e8-ad74-689371f71535" (UID: "cda998c8-9655-49e8-ad74-689371f71535"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.493349 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cda998c8-9655-49e8-ad74-689371f71535" (UID: "cda998c8-9655-49e8-ad74-689371f71535"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.523517 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cda998c8-9655-49e8-ad74-689371f71535" (UID: "cda998c8-9655-49e8-ad74-689371f71535"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.555709 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.555962 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx867\" (UniqueName: \"kubernetes.io/projected/cda998c8-9655-49e8-ad74-689371f71535-kube-api-access-mx867\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.555972 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.555982 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda998c8-9655-49e8-ad74-689371f71535-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.689416 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.707455 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.719222 4922 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:27 crc kubenswrapper[4922]: E0218 11:58:27.719660 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-log" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.719672 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-log" Feb 18 11:58:27 crc kubenswrapper[4922]: E0218 11:58:27.719692 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-metadata" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.719700 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-metadata" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.719910 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-metadata" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.719926 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda998c8-9655-49e8-ad74-689371f71535" containerName="nova-metadata-log" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.720999 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.724199 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.725962 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.741976 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.860926 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-logs\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.860980 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm79t\" (UniqueName: \"kubernetes.io/projected/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-kube-api-access-zm79t\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.861126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.861263 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.861331 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-config-data\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.963628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.963691 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.963710 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-config-data\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.963780 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-logs\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.963801 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm79t\" (UniqueName: \"kubernetes.io/projected/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-kube-api-access-zm79t\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.964621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-logs\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.969474 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-config-data\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.969799 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.987048 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:27 crc kubenswrapper[4922]: I0218 11:58:27.989844 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm79t\" (UniqueName: \"kubernetes.io/projected/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-kube-api-access-zm79t\") pod 
\"nova-metadata-0\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " pod="openstack/nova-metadata-0" Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.069860 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.358925 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31ef9a9b-fedd-4afd-8582-19ef097c98a2","Type":"ContainerStarted","Data":"dc8f2a77b2439df31b74dae3e22ccb5a2f39f8b3df7aa6e39e60625e8e0063ab"} Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.359323 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31ef9a9b-fedd-4afd-8582-19ef097c98a2","Type":"ContainerStarted","Data":"06bfc094ce0deb9842f7a22e016d3ee33b6481c041a25f6fda233b69aaff27d6"} Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.360490 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.382679 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.382658528 podStartE2EDuration="2.382658528s" podCreationTimestamp="2026-02-18 11:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:28.376640916 +0000 UTC m=+1310.104344996" watchObservedRunningTime="2026-02-18 11:58:28.382658528 +0000 UTC m=+1310.110362608" Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.536967 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:58:28 crc kubenswrapper[4922]: W0218 11:58:28.541633 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10a54dd7_a74b_49c4_a631_ad8fe2c22d58.slice/crio-0eb58513d2b717b0e4ac58c75a07be6926255b4cc98ac974a0fd2d38b35e5a01 WatchSource:0}: Error finding container 0eb58513d2b717b0e4ac58c75a07be6926255b4cc98ac974a0fd2d38b35e5a01: Status 404 returned error can't find the container with id 0eb58513d2b717b0e4ac58c75a07be6926255b4cc98ac974a0fd2d38b35e5a01 Feb 18 11:58:28 crc kubenswrapper[4922]: I0218 11:58:28.989004 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda998c8-9655-49e8-ad74-689371f71535" path="/var/lib/kubelet/pods/cda998c8-9655-49e8-ad74-689371f71535/volumes" Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.374622 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a54dd7-a74b-49c4-a631-ad8fe2c22d58","Type":"ContainerStarted","Data":"6eb028dcb7cb22f62f9b70fd21469cacb245f308ad0c1ab142649395ab9a2db8"} Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.374678 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a54dd7-a74b-49c4-a631-ad8fe2c22d58","Type":"ContainerStarted","Data":"b08e25eab8acf946afba995372397a3ab97e8a1f62eebeb6161f0fd5696132b0"} Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.374692 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a54dd7-a74b-49c4-a631-ad8fe2c22d58","Type":"ContainerStarted","Data":"0eb58513d2b717b0e4ac58c75a07be6926255b4cc98ac974a0fd2d38b35e5a01"} Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.404531 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.404512096 podStartE2EDuration="2.404512096s" podCreationTimestamp="2026-02-18 11:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 
11:58:29.393504438 +0000 UTC m=+1311.121208508" watchObservedRunningTime="2026-02-18 11:58:29.404512096 +0000 UTC m=+1311.132216176" Feb 18 11:58:29 crc kubenswrapper[4922]: E0218 11:58:29.731591 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 11:58:29 crc kubenswrapper[4922]: E0218 11:58:29.733213 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 11:58:29 crc kubenswrapper[4922]: E0218 11:58:29.736261 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 11:58:29 crc kubenswrapper[4922]: E0218 11:58:29.736308 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4e0d2e17-4045-420d-817b-41a1fc66c425" containerName="nova-scheduler-scheduler" Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.748571 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.821804 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6578955fd5-78vwz"] Feb 18 11:58:29 crc kubenswrapper[4922]: I0218 11:58:29.822082 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerName="dnsmasq-dns" containerID="cri-o://dcc50eb32469aa9ff5b74469bad439e5bb4a6c4458c1ea0f88da9a06c0a2437a" gracePeriod=10 Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.386618 4922 generic.go:334] "Generic (PLEG): container finished" podID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerID="ede12ae546b38989e02144d9c29b45bfe17c0490fbfcd99cc1a79c54dc349009" exitCode=137 Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.386678 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerDied","Data":"ede12ae546b38989e02144d9c29b45bfe17c0490fbfcd99cc1a79c54dc349009"} Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.386990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01766e5f-d149-4175-9fdb-15e65b0e0665","Type":"ContainerDied","Data":"9caa36cbf4a061073a94710aba88a19e4ceb523f9ac965d8c82018ec849187a8"} Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.387007 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9caa36cbf4a061073a94710aba88a19e4ceb523f9ac965d8c82018ec849187a8" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.389042 4922 generic.go:334] "Generic (PLEG): container finished" podID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerID="dcc50eb32469aa9ff5b74469bad439e5bb4a6c4458c1ea0f88da9a06c0a2437a" exitCode=0 Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.389122 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" 
event={"ID":"d36f2285-2752-4cad-bf52-fe6ae0b262d1","Type":"ContainerDied","Data":"dcc50eb32469aa9ff5b74469bad439e5bb4a6c4458c1ea0f88da9a06c0a2437a"} Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.466878 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.475426 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530129 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-nb\") pod \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530190 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-combined-ca-bundle\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530255 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-run-httpd\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530337 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-sb\") pod \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530386 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-config-data\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530447 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-scripts\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530473 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkcct\" (UniqueName: \"kubernetes.io/projected/01766e5f-d149-4175-9fdb-15e65b0e0665-kube-api-access-bkcct\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530498 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mphx\" (UniqueName: \"kubernetes.io/projected/d36f2285-2752-4cad-bf52-fe6ae0b262d1-kube-api-access-4mphx\") pod \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530587 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-config\") pod \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530615 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-swift-storage-0\") pod 
\"d36f2285-2752-4cad-bf52-fe6ae0b262d1\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530647 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-sg-core-conf-yaml\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530697 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-log-httpd\") pod \"01766e5f-d149-4175-9fdb-15e65b0e0665\" (UID: \"01766e5f-d149-4175-9fdb-15e65b0e0665\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.530737 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-svc\") pod \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\" (UID: \"d36f2285-2752-4cad-bf52-fe6ae0b262d1\") " Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.533196 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.535657 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.540770 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-scripts" (OuterVolumeSpecName: "scripts") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.543786 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36f2285-2752-4cad-bf52-fe6ae0b262d1-kube-api-access-4mphx" (OuterVolumeSpecName: "kube-api-access-4mphx") pod "d36f2285-2752-4cad-bf52-fe6ae0b262d1" (UID: "d36f2285-2752-4cad-bf52-fe6ae0b262d1"). InnerVolumeSpecName "kube-api-access-4mphx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.548608 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01766e5f-d149-4175-9fdb-15e65b0e0665-kube-api-access-bkcct" (OuterVolumeSpecName: "kube-api-access-bkcct") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "kube-api-access-bkcct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.569414 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.588576 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d36f2285-2752-4cad-bf52-fe6ae0b262d1" (UID: "d36f2285-2752-4cad-bf52-fe6ae0b262d1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.593292 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d36f2285-2752-4cad-bf52-fe6ae0b262d1" (UID: "d36f2285-2752-4cad-bf52-fe6ae0b262d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.603299 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-config" (OuterVolumeSpecName: "config") pod "d36f2285-2752-4cad-bf52-fe6ae0b262d1" (UID: "d36f2285-2752-4cad-bf52-fe6ae0b262d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.607259 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d36f2285-2752-4cad-bf52-fe6ae0b262d1" (UID: "d36f2285-2752-4cad-bf52-fe6ae0b262d1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.608944 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d36f2285-2752-4cad-bf52-fe6ae0b262d1" (UID: "d36f2285-2752-4cad-bf52-fe6ae0b262d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.626552 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632851 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632887 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632899 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632908 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkcct\" (UniqueName: \"kubernetes.io/projected/01766e5f-d149-4175-9fdb-15e65b0e0665-kube-api-access-bkcct\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 
crc kubenswrapper[4922]: I0218 11:58:30.632918 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mphx\" (UniqueName: \"kubernetes.io/projected/d36f2285-2752-4cad-bf52-fe6ae0b262d1-kube-api-access-4mphx\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632927 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632935 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632942 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632950 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01766e5f-d149-4175-9fdb-15e65b0e0665-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632957 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632965 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d36f2285-2752-4cad-bf52-fe6ae0b262d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.632972 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.652324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-config-data" (OuterVolumeSpecName: "config-data") pod "01766e5f-d149-4175-9fdb-15e65b0e0665" (UID: "01766e5f-d149-4175-9fdb-15e65b0e0665"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:30 crc kubenswrapper[4922]: I0218 11:58:30.734270 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01766e5f-d149-4175-9fdb-15e65b0e0665-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.401522 4922 generic.go:334] "Generic (PLEG): container finished" podID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerID="783aeec10a7933650343fa4b55375b721d65807531d8e7dbf7c2823b6b487dd6" exitCode=0 Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.401639 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2856a778-a8b2-4740-8d2a-4a6f64619bc2","Type":"ContainerDied","Data":"783aeec10a7933650343fa4b55375b721d65807531d8e7dbf7c2823b6b487dd6"} Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.405047 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" event={"ID":"d36f2285-2752-4cad-bf52-fe6ae0b262d1","Type":"ContainerDied","Data":"f6b2696bce7ccb6880bdda930a5ccfaf927c9ebc64dad81fb193a050fd9b8c85"} Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.405094 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.405118 4922 scope.go:117] "RemoveContainer" containerID="dcc50eb32469aa9ff5b74469bad439e5bb4a6c4458c1ea0f88da9a06c0a2437a" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.405128 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-78vwz" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.452138 4922 scope.go:117] "RemoveContainer" containerID="839de4434ebe21a5f0abbc718b56284e0f7743bf3463c809b6cae16fa7c2db5d" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.456454 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-78vwz"] Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.465110 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-78vwz"] Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.475831 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.486614 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500184 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:58:31 crc kubenswrapper[4922]: E0218 11:58:31.500627 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-central-agent" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500645 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-central-agent" Feb 18 11:58:31 crc kubenswrapper[4922]: E0218 11:58:31.500657 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerName="init" Feb 18 11:58:31 crc 
kubenswrapper[4922]: I0218 11:58:31.500663 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerName="init" Feb 18 11:58:31 crc kubenswrapper[4922]: E0218 11:58:31.500675 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="proxy-httpd" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500682 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="proxy-httpd" Feb 18 11:58:31 crc kubenswrapper[4922]: E0218 11:58:31.500696 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerName="dnsmasq-dns" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500702 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerName="dnsmasq-dns" Feb 18 11:58:31 crc kubenswrapper[4922]: E0218 11:58:31.500713 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="sg-core" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500719 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="sg-core" Feb 18 11:58:31 crc kubenswrapper[4922]: E0218 11:58:31.500743 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-notification-agent" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500749 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-notification-agent" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500902 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="sg-core" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 
11:58:31.500915 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-central-agent" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500928 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" containerName="dnsmasq-dns" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500940 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="ceilometer-notification-agent" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.500951 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" containerName="proxy-httpd" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.502944 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.505783 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.507347 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.531105 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557617 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-run-httpd\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557716 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-scripts\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-config-data\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557762 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557792 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2h7j\" (UniqueName: \"kubernetes.io/projected/e4572162-62bc-4e43-b260-497a609abd8e-kube-api-access-d2h7j\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.557878 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.636866 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659579 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-run-httpd\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659660 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-scripts\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659681 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-config-data\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659723 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " 
pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659768 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2h7j\" (UniqueName: \"kubernetes.io/projected/e4572162-62bc-4e43-b260-497a609abd8e-kube-api-access-d2h7j\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.659790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-log-httpd\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.660398 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-log-httpd\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.660405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-run-httpd\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.666146 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-scripts\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.673700 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-config-data\") 
pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.675084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.677104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.677152 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2h7j\" (UniqueName: \"kubernetes.io/projected/e4572162-62bc-4e43-b260-497a609abd8e-kube-api-access-d2h7j\") pod \"ceilometer-0\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.761173 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-combined-ca-bundle\") pod \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.761384 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqjqm\" (UniqueName: \"kubernetes.io/projected/2856a778-a8b2-4740-8d2a-4a6f64619bc2-kube-api-access-bqjqm\") pod \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.761410 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-config-data\") pod \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.761487 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2856a778-a8b2-4740-8d2a-4a6f64619bc2-logs\") pod \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\" (UID: \"2856a778-a8b2-4740-8d2a-4a6f64619bc2\") " Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.762421 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2856a778-a8b2-4740-8d2a-4a6f64619bc2-logs" (OuterVolumeSpecName: "logs") pod "2856a778-a8b2-4740-8d2a-4a6f64619bc2" (UID: "2856a778-a8b2-4740-8d2a-4a6f64619bc2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.767022 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2856a778-a8b2-4740-8d2a-4a6f64619bc2-kube-api-access-bqjqm" (OuterVolumeSpecName: "kube-api-access-bqjqm") pod "2856a778-a8b2-4740-8d2a-4a6f64619bc2" (UID: "2856a778-a8b2-4740-8d2a-4a6f64619bc2"). InnerVolumeSpecName "kube-api-access-bqjqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.790904 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2856a778-a8b2-4740-8d2a-4a6f64619bc2" (UID: "2856a778-a8b2-4740-8d2a-4a6f64619bc2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.794840 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-config-data" (OuterVolumeSpecName: "config-data") pod "2856a778-a8b2-4740-8d2a-4a6f64619bc2" (UID: "2856a778-a8b2-4740-8d2a-4a6f64619bc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.840876 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.863392 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.863420 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2856a778-a8b2-4740-8d2a-4a6f64619bc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.863429 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqjqm\" (UniqueName: \"kubernetes.io/projected/2856a778-a8b2-4740-8d2a-4a6f64619bc2-kube-api-access-bqjqm\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:31 crc kubenswrapper[4922]: I0218 11:58:31.863437 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2856a778-a8b2-4740-8d2a-4a6f64619bc2-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.192335 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.270397 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-config-data\") pod \"4e0d2e17-4045-420d-817b-41a1fc66c425\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.270472 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvvg5\" (UniqueName: \"kubernetes.io/projected/4e0d2e17-4045-420d-817b-41a1fc66c425-kube-api-access-qvvg5\") pod \"4e0d2e17-4045-420d-817b-41a1fc66c425\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.270523 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-combined-ca-bundle\") pod \"4e0d2e17-4045-420d-817b-41a1fc66c425\" (UID: \"4e0d2e17-4045-420d-817b-41a1fc66c425\") " Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.275274 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0d2e17-4045-420d-817b-41a1fc66c425-kube-api-access-qvvg5" (OuterVolumeSpecName: "kube-api-access-qvvg5") pod "4e0d2e17-4045-420d-817b-41a1fc66c425" (UID: "4e0d2e17-4045-420d-817b-41a1fc66c425"). InnerVolumeSpecName "kube-api-access-qvvg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.319351 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-config-data" (OuterVolumeSpecName: "config-data") pod "4e0d2e17-4045-420d-817b-41a1fc66c425" (UID: "4e0d2e17-4045-420d-817b-41a1fc66c425"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.331940 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e0d2e17-4045-420d-817b-41a1fc66c425" (UID: "4e0d2e17-4045-420d-817b-41a1fc66c425"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.365810 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.373311 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.373345 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvvg5\" (UniqueName: \"kubernetes.io/projected/4e0d2e17-4045-420d-817b-41a1fc66c425-kube-api-access-qvvg5\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.373374 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0d2e17-4045-420d-817b-41a1fc66c425-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.417515 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerStarted","Data":"be12ad7e80f1808fcecd3d9fd6a8cd2df6d592559b9d82740e6f5d36a70ad362"} Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.419388 4922 generic.go:334] "Generic (PLEG): container finished" podID="4e0d2e17-4045-420d-817b-41a1fc66c425" 
containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" exitCode=0 Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.419561 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e0d2e17-4045-420d-817b-41a1fc66c425","Type":"ContainerDied","Data":"c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85"} Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.419606 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e0d2e17-4045-420d-817b-41a1fc66c425","Type":"ContainerDied","Data":"5c4ed2cdb2b752aa88aa0c848f0558afd5579a29c6af7de94b0afbe9de2ec4ec"} Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.419634 4922 scope.go:117] "RemoveContainer" containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.419757 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.428209 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2856a778-a8b2-4740-8d2a-4a6f64619bc2","Type":"ContainerDied","Data":"0bc208ef650f9f609657312f6d8f2198c5a12a9f657dff0516610817ba9c8516"} Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.428283 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.462665 4922 scope.go:117] "RemoveContainer" containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" Feb 18 11:58:32 crc kubenswrapper[4922]: E0218 11:58:32.463478 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85\": container with ID starting with c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85 not found: ID does not exist" containerID="c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.463669 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85"} err="failed to get container status \"c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85\": rpc error: code = NotFound desc = could not find container \"c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85\": container with ID starting with c2ec66f0a6fe1edbd0a3743686091f7121951868a859f3c215737d15adf40c85 not found: ID does not exist" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.463691 4922 scope.go:117] "RemoveContainer" containerID="783aeec10a7933650343fa4b55375b721d65807531d8e7dbf7c2823b6b487dd6" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.477832 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.490466 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.513884 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.524384 4922 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.533999 4922 scope.go:117] "RemoveContainer" containerID="deb34b70b7dbc61de6d4d92ee5f97c64b68483ab7a9de52c357c2663ff82c337" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536025 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: E0218 11:58:32.536443 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-api" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536461 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-api" Feb 18 11:58:32 crc kubenswrapper[4922]: E0218 11:58:32.536488 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-log" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536495 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-log" Feb 18 11:58:32 crc kubenswrapper[4922]: E0218 11:58:32.536521 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0d2e17-4045-420d-817b-41a1fc66c425" containerName="nova-scheduler-scheduler" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536528 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0d2e17-4045-420d-817b-41a1fc66c425" containerName="nova-scheduler-scheduler" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536702 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0d2e17-4045-420d-817b-41a1fc66c425" containerName="nova-scheduler-scheduler" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536717 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" 
containerName="nova-api-log" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.536735 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" containerName="nova-api-api" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.537350 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.539337 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.547519 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.561560 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.563203 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.567740 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.586582 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688353 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d3c20f-b062-4987-bbc5-c0c030d5f340-logs\") pod \"nova-api-0\" 
(UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688415 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-config-data\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688456 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcwtr\" (UniqueName: \"kubernetes.io/projected/fe95484c-ea5d-4ea3-8915-bb6734014373-kube-api-access-bcwtr\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688498 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688527 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-config-data\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.688556 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vftr\" (UniqueName: \"kubernetes.io/projected/47d3c20f-b062-4987-bbc5-c0c030d5f340-kube-api-access-4vftr\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 
11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.789937 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d3c20f-b062-4987-bbc5-c0c030d5f340-logs\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790086 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-config-data\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790109 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcwtr\" (UniqueName: \"kubernetes.io/projected/fe95484c-ea5d-4ea3-8915-bb6734014373-kube-api-access-bcwtr\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790183 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-config-data\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790219 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vftr\" (UniqueName: \"kubernetes.io/projected/47d3c20f-b062-4987-bbc5-c0c030d5f340-kube-api-access-4vftr\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.790796 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d3c20f-b062-4987-bbc5-c0c030d5f340-logs\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.798222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-config-data\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.799320 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-config-data\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.800332 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.802742 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.808090 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcwtr\" (UniqueName: \"kubernetes.io/projected/fe95484c-ea5d-4ea3-8915-bb6734014373-kube-api-access-bcwtr\") pod \"nova-scheduler-0\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") " pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.808655 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vftr\" (UniqueName: \"kubernetes.io/projected/47d3c20f-b062-4987-bbc5-c0c030d5f340-kube-api-access-4vftr\") pod \"nova-api-0\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.869037 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.891681 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.984969 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01766e5f-d149-4175-9fdb-15e65b0e0665" path="/var/lib/kubelet/pods/01766e5f-d149-4175-9fdb-15e65b0e0665/volumes" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.986130 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2856a778-a8b2-4740-8d2a-4a6f64619bc2" path="/var/lib/kubelet/pods/2856a778-a8b2-4740-8d2a-4a6f64619bc2/volumes" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.986730 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0d2e17-4045-420d-817b-41a1fc66c425" path="/var/lib/kubelet/pods/4e0d2e17-4045-420d-817b-41a1fc66c425/volumes" Feb 18 11:58:32 crc kubenswrapper[4922]: I0218 11:58:32.987757 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36f2285-2752-4cad-bf52-fe6ae0b262d1" path="/var/lib/kubelet/pods/d36f2285-2752-4cad-bf52-fe6ae0b262d1/volumes" Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.070489 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.070567 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 11:58:33 crc kubenswrapper[4922]: W0218 11:58:33.329466 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47d3c20f_b062_4987_bbc5_c0c030d5f340.slice/crio-601e916b018d0517ad916460b508b132b7e210db7b6fd91891fecb64ba408b19 WatchSource:0}: Error finding container 601e916b018d0517ad916460b508b132b7e210db7b6fd91891fecb64ba408b19: Status 404 returned error can't find the container with id 601e916b018d0517ad916460b508b132b7e210db7b6fd91891fecb64ba408b19 Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.333022 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.422708 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.448230 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe95484c-ea5d-4ea3-8915-bb6734014373","Type":"ContainerStarted","Data":"bde417a0228740daf9a9d49c2dd39d718faec4df17eddbb3036309edba429d67"} Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.453211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerStarted","Data":"722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db"} Feb 18 11:58:33 crc kubenswrapper[4922]: I0218 11:58:33.458404 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d3c20f-b062-4987-bbc5-c0c030d5f340","Type":"ContainerStarted","Data":"601e916b018d0517ad916460b508b132b7e210db7b6fd91891fecb64ba408b19"} Feb 18 11:58:34 crc kubenswrapper[4922]: I0218 11:58:34.489271 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerStarted","Data":"f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21"} Feb 18 11:58:34 crc kubenswrapper[4922]: I0218 11:58:34.493077 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d3c20f-b062-4987-bbc5-c0c030d5f340","Type":"ContainerStarted","Data":"d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8"} Feb 18 11:58:34 crc kubenswrapper[4922]: I0218 11:58:34.493117 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"47d3c20f-b062-4987-bbc5-c0c030d5f340","Type":"ContainerStarted","Data":"eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03"} Feb 18 11:58:34 crc kubenswrapper[4922]: I0218 11:58:34.498581 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe95484c-ea5d-4ea3-8915-bb6734014373","Type":"ContainerStarted","Data":"8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5"} Feb 18 11:58:34 crc kubenswrapper[4922]: I0218 11:58:34.519167 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.519130412 podStartE2EDuration="2.519130412s" podCreationTimestamp="2026-02-18 11:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:34.515634014 +0000 UTC m=+1316.243338094" watchObservedRunningTime="2026-02-18 11:58:34.519130412 +0000 UTC m=+1316.246834502" Feb 18 11:58:34 crc kubenswrapper[4922]: I0218 11:58:34.550330 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5503119400000003 podStartE2EDuration="2.55031194s" podCreationTimestamp="2026-02-18 11:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:34.536630454 +0000 UTC m=+1316.264334534" watchObservedRunningTime="2026-02-18 11:58:34.55031194 +0000 UTC m=+1316.278016020" Feb 18 11:58:35 crc kubenswrapper[4922]: I0218 11:58:35.509996 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerStarted","Data":"6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6"} Feb 18 11:58:36 crc kubenswrapper[4922]: I0218 11:58:36.523840 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerStarted","Data":"3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112"} Feb 18 11:58:36 crc kubenswrapper[4922]: I0218 11:58:36.524802 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 11:58:36 crc kubenswrapper[4922]: I0218 11:58:36.560613 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.724613164 podStartE2EDuration="5.560579332s" podCreationTimestamp="2026-02-18 11:58:31 +0000 UTC" firstStartedPulling="2026-02-18 11:58:32.374029613 +0000 UTC m=+1314.101733693" lastFinishedPulling="2026-02-18 11:58:36.209995761 +0000 UTC m=+1317.937699861" observedRunningTime="2026-02-18 11:58:36.548265571 +0000 UTC m=+1318.275969641" watchObservedRunningTime="2026-02-18 11:58:36.560579332 +0000 UTC m=+1318.288283412" Feb 18 11:58:36 crc kubenswrapper[4922]: I0218 11:58:36.790832 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 18 11:58:37 crc kubenswrapper[4922]: I0218 11:58:37.870047 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 11:58:38 crc kubenswrapper[4922]: I0218 11:58:38.070797 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 11:58:38 crc kubenswrapper[4922]: I0218 11:58:38.070841 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 11:58:39 crc kubenswrapper[4922]: I0218 11:58:39.082616 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Feb 18 11:58:39 crc kubenswrapper[4922]: I0218 11:58:39.082631 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:58:42 crc kubenswrapper[4922]: I0218 11:58:42.869804 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 11:58:42 crc kubenswrapper[4922]: I0218 11:58:42.892090 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 11:58:42 crc kubenswrapper[4922]: I0218 11:58:42.892402 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 11:58:42 crc kubenswrapper[4922]: I0218 11:58:42.903149 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 11:58:43 crc kubenswrapper[4922]: I0218 11:58:43.626753 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 11:58:43 crc kubenswrapper[4922]: I0218 11:58:43.974823 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 11:58:43 crc kubenswrapper[4922]: I0218 11:58:43.975243 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 11:58:48 crc kubenswrapper[4922]: 
I0218 11:58:48.076295 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 11:58:48 crc kubenswrapper[4922]: I0218 11:58:48.077608 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 11:58:48 crc kubenswrapper[4922]: I0218 11:58:48.082319 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 11:58:48 crc kubenswrapper[4922]: I0218 11:58:48.641451 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.685060 4922 generic.go:334] "Generic (PLEG): container finished" podID="3ce92cda-4459-4a73-8fc2-84bbb56eccce" containerID="c14ea122843eb522430115cdefefb0c3b7d6b58a0e5c203b5fb2fca56b1e56ba" exitCode=137 Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.686051 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3ce92cda-4459-4a73-8fc2-84bbb56eccce","Type":"ContainerDied","Data":"c14ea122843eb522430115cdefefb0c3b7d6b58a0e5c203b5fb2fca56b1e56ba"} Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.686107 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3ce92cda-4459-4a73-8fc2-84bbb56eccce","Type":"ContainerDied","Data":"92d7d00c3ed4a00a1aeac19f0bbbe19ab3df77be12277cadf25b39bf998465e7"} Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.686122 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d7d00c3ed4a00a1aeac19f0bbbe19ab3df77be12277cadf25b39bf998465e7" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.710884 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.782961 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-config-data\") pod \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.783229 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-combined-ca-bundle\") pod \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.783302 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv5hb\" (UniqueName: \"kubernetes.io/projected/3ce92cda-4459-4a73-8fc2-84bbb56eccce-kube-api-access-mv5hb\") pod \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\" (UID: \"3ce92cda-4459-4a73-8fc2-84bbb56eccce\") " Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.792407 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce92cda-4459-4a73-8fc2-84bbb56eccce-kube-api-access-mv5hb" (OuterVolumeSpecName: "kube-api-access-mv5hb") pod "3ce92cda-4459-4a73-8fc2-84bbb56eccce" (UID: "3ce92cda-4459-4a73-8fc2-84bbb56eccce"). InnerVolumeSpecName "kube-api-access-mv5hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.813278 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-config-data" (OuterVolumeSpecName: "config-data") pod "3ce92cda-4459-4a73-8fc2-84bbb56eccce" (UID: "3ce92cda-4459-4a73-8fc2-84bbb56eccce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.815464 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ce92cda-4459-4a73-8fc2-84bbb56eccce" (UID: "3ce92cda-4459-4a73-8fc2-84bbb56eccce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.885316 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.885630 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv5hb\" (UniqueName: \"kubernetes.io/projected/3ce92cda-4459-4a73-8fc2-84bbb56eccce-kube-api-access-mv5hb\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:51 crc kubenswrapper[4922]: I0218 11:58:51.885644 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce92cda-4459-4a73-8fc2-84bbb56eccce-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.696231 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.740581 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.764543 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.778948 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:52 crc kubenswrapper[4922]: E0218 11:58:52.779716 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce92cda-4459-4a73-8fc2-84bbb56eccce" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.779742 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce92cda-4459-4a73-8fc2-84bbb56eccce" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.780054 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce92cda-4459-4a73-8fc2-84bbb56eccce" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.783112 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.786480 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.786803 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.787001 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.797569 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.898989 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.899048 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.899847 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.899890 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.903899 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.904707 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2whs\" (UniqueName: \"kubernetes.io/projected/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-kube-api-access-h2whs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 
11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.904739 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.904748 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.905095 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.905169 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.905320 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:52 crc kubenswrapper[4922]: E0218 11:58:52.932199 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ce92cda_4459_4a73_8fc2_84bbb56eccce.slice/crio-92d7d00c3ed4a00a1aeac19f0bbbe19ab3df77be12277cadf25b39bf998465e7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ce92cda_4459_4a73_8fc2_84bbb56eccce.slice\": RecentStats: unable to find data in memory cache]" Feb 18 11:58:52 crc kubenswrapper[4922]: I0218 11:58:52.988150 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce92cda-4459-4a73-8fc2-84bbb56eccce" path="/var/lib/kubelet/pods/3ce92cda-4459-4a73-8fc2-84bbb56eccce/volumes" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.007044 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.007125 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.007223 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2whs\" (UniqueName: \"kubernetes.io/projected/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-kube-api-access-h2whs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.007243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.007308 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.021823 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.029620 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.032026 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.035000 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.046094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2whs\" (UniqueName: \"kubernetes.io/projected/5f598a92-b7cc-4584-9a17-d4c6d031ceeb-kube-api-access-h2whs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f598a92-b7cc-4584-9a17-d4c6d031ceeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.104731 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-c2lrw"] Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.106496 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.110783 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.126293 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-c2lrw"] Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.211783 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.211868 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 
18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.211902 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chbg5\" (UniqueName: \"kubernetes.io/projected/7ec5b650-c58d-4b8b-a903-7b95c211139c-kube-api-access-chbg5\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.211968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.211999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-config\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.212020 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.314137 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 
11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.314650 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chbg5\" (UniqueName: \"kubernetes.io/projected/7ec5b650-c58d-4b8b-a903-7b95c211139c-kube-api-access-chbg5\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.314882 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.314978 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-config\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.315014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.315201 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.316280 
4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.316990 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.317486 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.318022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.318354 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-config\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.352135 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chbg5\" (UniqueName: 
\"kubernetes.io/projected/7ec5b650-c58d-4b8b-a903-7b95c211139c-kube-api-access-chbg5\") pod \"dnsmasq-dns-cd5cbd7b9-c2lrw\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.460023 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.712224 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 11:58:53 crc kubenswrapper[4922]: I0218 11:58:53.999855 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-c2lrw"] Feb 18 11:58:54 crc kubenswrapper[4922]: W0218 11:58:54.001546 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ec5b650_c58d_4b8b_a903_7b95c211139c.slice/crio-ec7c038a1e2201c0d4227eabf40ab3692ebca8c65c8f35a81b082989a7f57de6 WatchSource:0}: Error finding container ec7c038a1e2201c0d4227eabf40ab3692ebca8c65c8f35a81b082989a7f57de6: Status 404 returned error can't find the container with id ec7c038a1e2201c0d4227eabf40ab3692ebca8c65c8f35a81b082989a7f57de6 Feb 18 11:58:54 crc kubenswrapper[4922]: I0218 11:58:54.721223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5f598a92-b7cc-4584-9a17-d4c6d031ceeb","Type":"ContainerStarted","Data":"003eac58c03e9c8f7ea6c029fc1db8f62780dac546972e4e818e3806deaeb204"} Feb 18 11:58:54 crc kubenswrapper[4922]: I0218 11:58:54.721832 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5f598a92-b7cc-4584-9a17-d4c6d031ceeb","Type":"ContainerStarted","Data":"cfc37d913d6a2fdff18a19dd632edb4a4e54665ecfc74a7e53fdf5d756b1cb82"} Feb 18 11:58:54 crc kubenswrapper[4922]: I0218 11:58:54.724975 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerID="002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a" exitCode=0 Feb 18 11:58:54 crc kubenswrapper[4922]: I0218 11:58:54.725655 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" event={"ID":"7ec5b650-c58d-4b8b-a903-7b95c211139c","Type":"ContainerDied","Data":"002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a"} Feb 18 11:58:54 crc kubenswrapper[4922]: I0218 11:58:54.725694 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" event={"ID":"7ec5b650-c58d-4b8b-a903-7b95c211139c","Type":"ContainerStarted","Data":"ec7c038a1e2201c0d4227eabf40ab3692ebca8c65c8f35a81b082989a7f57de6"} Feb 18 11:58:54 crc kubenswrapper[4922]: I0218 11:58:54.750020 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.749999754 podStartE2EDuration="2.749999754s" podCreationTimestamp="2026-02-18 11:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:54.742871094 +0000 UTC m=+1336.470575174" watchObservedRunningTime="2026-02-18 11:58:54.749999754 +0000 UTC m=+1336.477703834" Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.463791 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.676113 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.676493 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="proxy-httpd" containerID="cri-o://3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112" gracePeriod=30 Feb 18 11:58:55 crc 
kubenswrapper[4922]: I0218 11:58:55.676528 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-notification-agent" containerID="cri-o://f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21" gracePeriod=30 Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.676528 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="sg-core" containerID="cri-o://6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6" gracePeriod=30 Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.676676 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-central-agent" containerID="cri-o://722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db" gracePeriod=30 Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.691920 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.214:3000/\": EOF" Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.740750 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" event={"ID":"7ec5b650-c58d-4b8b-a903-7b95c211139c","Type":"ContainerStarted","Data":"e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4"} Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.740899 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-log" containerID="cri-o://eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03" gracePeriod=30 Feb 18 
11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.740990 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.741030 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-api" containerID="cri-o://d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8" gracePeriod=30 Feb 18 11:58:55 crc kubenswrapper[4922]: I0218 11:58:55.775975 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" podStartSLOduration=2.775958967 podStartE2EDuration="2.775958967s" podCreationTimestamp="2026-02-18 11:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:58:55.771991876 +0000 UTC m=+1337.499695956" watchObservedRunningTime="2026-02-18 11:58:55.775958967 +0000 UTC m=+1337.503663047" Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.784816 4922 generic.go:334] "Generic (PLEG): container finished" podID="e4572162-62bc-4e43-b260-497a609abd8e" containerID="3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112" exitCode=0 Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.785117 4922 generic.go:334] "Generic (PLEG): container finished" podID="e4572162-62bc-4e43-b260-497a609abd8e" containerID="6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6" exitCode=2 Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.785131 4922 generic.go:334] "Generic (PLEG): container finished" podID="e4572162-62bc-4e43-b260-497a609abd8e" containerID="722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db" exitCode=0 Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.785031 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerDied","Data":"3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112"} Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.785214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerDied","Data":"6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6"} Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.785233 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerDied","Data":"722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db"} Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.791043 4922 generic.go:334] "Generic (PLEG): container finished" podID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerID="eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03" exitCode=143 Feb 18 11:58:56 crc kubenswrapper[4922]: I0218 11:58:56.791107 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d3c20f-b062-4987-bbc5-c0c030d5f340","Type":"ContainerDied","Data":"eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03"} Feb 18 11:58:58 crc kubenswrapper[4922]: I0218 11:58:58.112119 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.451871 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.550025 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-combined-ca-bundle\") pod \"47d3c20f-b062-4987-bbc5-c0c030d5f340\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.550109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-config-data\") pod \"47d3c20f-b062-4987-bbc5-c0c030d5f340\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.550261 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d3c20f-b062-4987-bbc5-c0c030d5f340-logs\") pod \"47d3c20f-b062-4987-bbc5-c0c030d5f340\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.550320 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vftr\" (UniqueName: \"kubernetes.io/projected/47d3c20f-b062-4987-bbc5-c0c030d5f340-kube-api-access-4vftr\") pod \"47d3c20f-b062-4987-bbc5-c0c030d5f340\" (UID: \"47d3c20f-b062-4987-bbc5-c0c030d5f340\") " Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.550631 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d3c20f-b062-4987-bbc5-c0c030d5f340-logs" (OuterVolumeSpecName: "logs") pod "47d3c20f-b062-4987-bbc5-c0c030d5f340" (UID: "47d3c20f-b062-4987-bbc5-c0c030d5f340"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.551323 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d3c20f-b062-4987-bbc5-c0c030d5f340-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.558398 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d3c20f-b062-4987-bbc5-c0c030d5f340-kube-api-access-4vftr" (OuterVolumeSpecName: "kube-api-access-4vftr") pod "47d3c20f-b062-4987-bbc5-c0c030d5f340" (UID: "47d3c20f-b062-4987-bbc5-c0c030d5f340"). InnerVolumeSpecName "kube-api-access-4vftr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.589153 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-config-data" (OuterVolumeSpecName: "config-data") pod "47d3c20f-b062-4987-bbc5-c0c030d5f340" (UID: "47d3c20f-b062-4987-bbc5-c0c030d5f340"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.603142 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47d3c20f-b062-4987-bbc5-c0c030d5f340" (UID: "47d3c20f-b062-4987-bbc5-c0c030d5f340"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.653412 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.653455 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d3c20f-b062-4987-bbc5-c0c030d5f340-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.653467 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vftr\" (UniqueName: \"kubernetes.io/projected/47d3c20f-b062-4987-bbc5-c0c030d5f340-kube-api-access-4vftr\") on node \"crc\" DevicePath \"\"" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.820860 4922 generic.go:334] "Generic (PLEG): container finished" podID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerID="d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8" exitCode=0 Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.820900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d3c20f-b062-4987-bbc5-c0c030d5f340","Type":"ContainerDied","Data":"d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8"} Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.820924 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d3c20f-b062-4987-bbc5-c0c030d5f340","Type":"ContainerDied","Data":"601e916b018d0517ad916460b508b132b7e210db7b6fd91891fecb64ba408b19"} Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.820941 4922 scope.go:117] "RemoveContainer" containerID="d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.821071 4922 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.845431 4922 scope.go:117] "RemoveContainer" containerID="eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.861011 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.873723 4922 scope.go:117] "RemoveContainer" containerID="d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8" Feb 18 11:58:59 crc kubenswrapper[4922]: E0218 11:58:59.874216 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8\": container with ID starting with d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8 not found: ID does not exist" containerID="d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.874245 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8"} err="failed to get container status \"d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8\": rpc error: code = NotFound desc = could not find container \"d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8\": container with ID starting with d99ebc30f0488568efddde0bb6744320dc826a942aa1a3c10cb346c0e3fb14c8 not found: ID does not exist" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.874266 4922 scope.go:117] "RemoveContainer" containerID="eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.874318 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:59 crc kubenswrapper[4922]: E0218 
11:58:59.874594 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03\": container with ID starting with eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03 not found: ID does not exist" containerID="eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.874615 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03"} err="failed to get container status \"eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03\": rpc error: code = NotFound desc = could not find container \"eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03\": container with ID starting with eb934588c08751d8b95e01ad04d470827c74e1b9d1823bd58e871581661e4e03 not found: ID does not exist" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.885372 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:59 crc kubenswrapper[4922]: E0218 11:58:59.885847 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-api" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.885859 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-api" Feb 18 11:58:59 crc kubenswrapper[4922]: E0218 11:58:59.885871 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-log" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.885878 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-log" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.886061 
4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-log" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.886074 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" containerName="nova-api-api" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.887128 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.925309 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.925578 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.925621 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.947140 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.960139 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.960655 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-config-data\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.960780 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0a029b-ba40-494a-b439-5ddf2073ad00-logs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.960835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn58b\" (UniqueName: \"kubernetes.io/projected/5c0a029b-ba40-494a-b439-5ddf2073ad00-kube-api-access-wn58b\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.961265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:58:59 crc kubenswrapper[4922]: I0218 11:58:59.961431 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.063334 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.063729 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.063785 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.063870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-config-data\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.063918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0a029b-ba40-494a-b439-5ddf2073ad00-logs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.063951 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn58b\" (UniqueName: \"kubernetes.io/projected/5c0a029b-ba40-494a-b439-5ddf2073ad00-kube-api-access-wn58b\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.064933 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0a029b-ba40-494a-b439-5ddf2073ad00-logs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.071824 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.072902 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.075245 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-config-data\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.078252 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.089171 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn58b\" (UniqueName: \"kubernetes.io/projected/5c0a029b-ba40-494a-b439-5ddf2073ad00-kube-api-access-wn58b\") pod \"nova-api-0\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") " pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.260111 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.448386 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.589754 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-config-data\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.589816 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-run-httpd\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.589884 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-scripts\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.589919 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2h7j\" (UniqueName: \"kubernetes.io/projected/e4572162-62bc-4e43-b260-497a609abd8e-kube-api-access-d2h7j\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.589956 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-combined-ca-bundle\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.590039 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-sg-core-conf-yaml\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.590078 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-log-httpd\") pod \"e4572162-62bc-4e43-b260-497a609abd8e\" (UID: \"e4572162-62bc-4e43-b260-497a609abd8e\") " Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.591979 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.592097 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.604430 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-scripts" (OuterVolumeSpecName: "scripts") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.612548 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4572162-62bc-4e43-b260-497a609abd8e-kube-api-access-d2h7j" (OuterVolumeSpecName: "kube-api-access-d2h7j") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "kube-api-access-d2h7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.628647 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.692013 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.692036 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.692047 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2h7j\" (UniqueName: \"kubernetes.io/projected/e4572162-62bc-4e43-b260-497a609abd8e-kube-api-access-d2h7j\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.692056 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.692064 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4572162-62bc-4e43-b260-497a609abd8e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.709229 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.715680 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-config-data" (OuterVolumeSpecName: "config-data") pod "e4572162-62bc-4e43-b260-497a609abd8e" (UID: "e4572162-62bc-4e43-b260-497a609abd8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.793551 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.793759 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4572162-62bc-4e43-b260-497a609abd8e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.793869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.832431 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c0a029b-ba40-494a-b439-5ddf2073ad00","Type":"ContainerStarted","Data":"1e23b924bddad76596057cd410450387011174217e8c9d8f3144d75f72013eac"} Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.836103 4922 generic.go:334] "Generic (PLEG): container finished" podID="e4572162-62bc-4e43-b260-497a609abd8e" containerID="f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21" exitCode=0 Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.836143 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerDied","Data":"f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21"} Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.836450 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4572162-62bc-4e43-b260-497a609abd8e","Type":"ContainerDied","Data":"be12ad7e80f1808fcecd3d9fd6a8cd2df6d592559b9d82740e6f5d36a70ad362"} Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.836183 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.836510 4922 scope.go:117] "RemoveContainer" containerID="3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.862354 4922 scope.go:117] "RemoveContainer" containerID="6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.921424 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.939455 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.953724 4922 scope.go:117] "RemoveContainer" containerID="f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.964328 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:59:00 crc kubenswrapper[4922]: E0218 11:59:00.965010 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="proxy-httpd" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965029 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="proxy-httpd" Feb 18 11:59:00 crc kubenswrapper[4922]: E0218 11:59:00.965054 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-central-agent" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965063 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-central-agent" Feb 18 11:59:00 crc kubenswrapper[4922]: E0218 11:59:00.965084 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="sg-core" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965092 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="sg-core" Feb 18 11:59:00 crc kubenswrapper[4922]: E0218 11:59:00.965107 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-notification-agent" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965113 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-notification-agent" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965310 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-notification-agent" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965325 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="sg-core" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965337 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="proxy-httpd" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.965356 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4572162-62bc-4e43-b260-497a609abd8e" containerName="ceilometer-central-agent" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.969575 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.976905 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 11:59:00 crc kubenswrapper[4922]: I0218 11:59:00.977423 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.082862 4922 scope.go:117] "RemoveContainer" containerID="722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.159854 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d3c20f-b062-4987-bbc5-c0c030d5f340" path="/var/lib/kubelet/pods/47d3c20f-b062-4987-bbc5-c0c030d5f340/volumes" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.162024 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4572162-62bc-4e43-b260-497a609abd8e" path="/var/lib/kubelet/pods/e4572162-62bc-4e43-b260-497a609abd8e/volumes" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.163841 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.174683 4922 scope.go:117] "RemoveContainer" containerID="3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112" Feb 18 11:59:01 crc kubenswrapper[4922]: E0218 11:59:01.175206 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112\": container with ID starting with 3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112 not found: ID does not exist" containerID="3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.175247 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112"} err="failed to get container status \"3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112\": rpc error: code = NotFound desc = could not find container \"3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112\": container with ID starting with 3883aa82d9558789bd29239e1692a77ac44f74c43c7e20ecd49ca776053c9112 not found: ID does not exist" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.175294 4922 scope.go:117] "RemoveContainer" containerID="6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6" Feb 18 11:59:01 crc kubenswrapper[4922]: E0218 11:59:01.175618 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6\": container with ID starting with 6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6 not found: ID does not exist" containerID="6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.175663 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6"} err="failed to get container status \"6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6\": rpc error: code = NotFound desc = could not find container \"6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6\": container with ID starting with 6f03b8ff2055021074f5cc7bd367566e2ac086e99a1b9b651b0a938adeca63c6 not found: ID does not exist" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.175685 4922 scope.go:117] "RemoveContainer" containerID="f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21" Feb 18 11:59:01 crc kubenswrapper[4922]: E0218 11:59:01.176076 4922 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21\": container with ID starting with f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21 not found: ID does not exist" containerID="f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.176121 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21"} err="failed to get container status \"f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21\": rpc error: code = NotFound desc = could not find container \"f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21\": container with ID starting with f89cedadf6eff1297f8ed80ba94b9fc241d82002338f1e62865bf185883f0e21 not found: ID does not exist" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.176150 4922 scope.go:117] "RemoveContainer" containerID="722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db" Feb 18 11:59:01 crc kubenswrapper[4922]: E0218 11:59:01.176675 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db\": container with ID starting with 722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db not found: ID does not exist" containerID="722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.176726 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db"} err="failed to get container status \"722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db\": rpc error: code = NotFound desc = could not find container 
\"722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db\": container with ID starting with 722060fc2ca1ae896424d7a12df3ac6d06986256febfd98592ede2235bdf88db not found: ID does not exist" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.194960 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.195015 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-scripts\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.195202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.195693 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-config-data\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.195732 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h25l\" (UniqueName: \"kubernetes.io/projected/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-kube-api-access-8h25l\") pod \"ceilometer-0\" (UID: 
\"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.195772 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.196068 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.297814 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.298215 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-config-data\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.298248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h25l\" (UniqueName: \"kubernetes.io/projected/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-kube-api-access-8h25l\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.298275 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.298354 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.299120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.299148 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-scripts\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.298443 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.299728 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 
11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.304425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-scripts\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.305490 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.305587 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.320162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-config-data\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.321173 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h25l\" (UniqueName: \"kubernetes.io/projected/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-kube-api-access-8h25l\") pod \"ceilometer-0\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") " pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.465755 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.860009 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c0a029b-ba40-494a-b439-5ddf2073ad00","Type":"ContainerStarted","Data":"1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94"} Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.860284 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c0a029b-ba40-494a-b439-5ddf2073ad00","Type":"ContainerStarted","Data":"43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422"} Feb 18 11:59:01 crc kubenswrapper[4922]: I0218 11:59:01.880871 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.880846613 podStartE2EDuration="2.880846613s" podCreationTimestamp="2026-02-18 11:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:59:01.87794129 +0000 UTC m=+1343.605645390" watchObservedRunningTime="2026-02-18 11:59:01.880846613 +0000 UTC m=+1343.608550693" Feb 18 11:59:02 crc kubenswrapper[4922]: I0218 11:59:02.020110 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 11:59:02 crc kubenswrapper[4922]: W0218 11:59:02.022866 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b8f165_b92e_47d4_ada4_5eee351d6a5a.slice/crio-f48c4f106861c2b0fbbd89f3ec527e011b85b43938cee7887c256abc61904ff7 WatchSource:0}: Error finding container f48c4f106861c2b0fbbd89f3ec527e011b85b43938cee7887c256abc61904ff7: Status 404 returned error can't find the container with id f48c4f106861c2b0fbbd89f3ec527e011b85b43938cee7887c256abc61904ff7 Feb 18 11:59:02 crc kubenswrapper[4922]: I0218 11:59:02.875537 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerStarted","Data":"a917c6100631e0cecbb43e73dcdd3017b9ee5fcde8f4b2eb40ffa1cf7a6ddd61"} Feb 18 11:59:02 crc kubenswrapper[4922]: I0218 11:59:02.876054 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerStarted","Data":"f48c4f106861c2b0fbbd89f3ec527e011b85b43938cee7887c256abc61904ff7"} Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.112210 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.134765 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.461500 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.565989 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-pc7hm"] Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.566197 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="dnsmasq-dns" containerID="cri-o://42671006b982c23fe661235bf5a74dcd5e79ee9d03166f4a0fe4596b08069112" gracePeriod=10 Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.889606 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerStarted","Data":"c0b1ec47222cf49451bd1b033a64e0ee80b9aa5aed82b2529850f17df47b4532"} Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.894644 4922 generic.go:334] "Generic (PLEG): container finished" 
podID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerID="42671006b982c23fe661235bf5a74dcd5e79ee9d03166f4a0fe4596b08069112" exitCode=0 Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.896212 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" event={"ID":"5fe07f11-0f10-4aa7-ab94-51d42b7a6367","Type":"ContainerDied","Data":"42671006b982c23fe661235bf5a74dcd5e79ee9d03166f4a0fe4596b08069112"} Feb 18 11:59:03 crc kubenswrapper[4922]: I0218 11:59:03.910960 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.787480 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dqtvp"] Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.790831 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.792703 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.824876 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.837967 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dqtvp"] Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.904124 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.904478 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-scripts\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.904525 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb558\" (UniqueName: \"kubernetes.io/projected/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-kube-api-access-fb558\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.904557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-config-data\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.931252 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.936641 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" event={"ID":"5fe07f11-0f10-4aa7-ab94-51d42b7a6367","Type":"ContainerDied","Data":"0199a345c591cc6803424fbda06bc8be81c6f2649bc4692acfd3ea6b8d197abc"} Feb 18 11:59:04 crc kubenswrapper[4922]: I0218 11:59:04.936684 4922 scope.go:117] "RemoveContainer" containerID="42671006b982c23fe661235bf5a74dcd5e79ee9d03166f4a0fe4596b08069112" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.017702 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.017776 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-scripts\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.017817 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb558\" (UniqueName: \"kubernetes.io/projected/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-kube-api-access-fb558\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.017848 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-config-data\") pod 
\"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.040387 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-config-data\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.044994 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-scripts\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.048995 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.055723 4922 scope.go:117] "RemoveContainer" containerID="8aa54b45b2152668d79b56f9c12b91df1925011ab0dbd7a2601a1ffa9f2d27a9" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.065917 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb558\" (UniqueName: \"kubernetes.io/projected/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-kube-api-access-fb558\") pod \"nova-cell1-cell-mapping-dqtvp\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.134859 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-sb\") pod \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.134941 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-svc\") pod \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.134991 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-config\") pod \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.135046 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-swift-storage-0\") pod \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.135121 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-nb\") pod \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.135148 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29tzw\" (UniqueName: \"kubernetes.io/projected/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-kube-api-access-29tzw\") pod \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\" (UID: \"5fe07f11-0f10-4aa7-ab94-51d42b7a6367\") " Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 
11:59:05.157102 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-kube-api-access-29tzw" (OuterVolumeSpecName: "kube-api-access-29tzw") pod "5fe07f11-0f10-4aa7-ab94-51d42b7a6367" (UID: "5fe07f11-0f10-4aa7-ab94-51d42b7a6367"). InnerVolumeSpecName "kube-api-access-29tzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.187532 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5fe07f11-0f10-4aa7-ab94-51d42b7a6367" (UID: "5fe07f11-0f10-4aa7-ab94-51d42b7a6367"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.201515 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-config" (OuterVolumeSpecName: "config") pod "5fe07f11-0f10-4aa7-ab94-51d42b7a6367" (UID: "5fe07f11-0f10-4aa7-ab94-51d42b7a6367"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.218786 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5fe07f11-0f10-4aa7-ab94-51d42b7a6367" (UID: "5fe07f11-0f10-4aa7-ab94-51d42b7a6367"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.220590 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5fe07f11-0f10-4aa7-ab94-51d42b7a6367" (UID: "5fe07f11-0f10-4aa7-ab94-51d42b7a6367"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.238743 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.238802 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.238814 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29tzw\" (UniqueName: \"kubernetes.io/projected/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-kube-api-access-29tzw\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.238829 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.238857 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.238889 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5fe07f11-0f10-4aa7-ab94-51d42b7a6367" (UID: "5fe07f11-0f10-4aa7-ab94-51d42b7a6367"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.289482 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.340496 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fe07f11-0f10-4aa7-ab94-51d42b7a6367-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.759416 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dqtvp"] Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.964961 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerStarted","Data":"4e1f36256af229f8391ab980166248080c950f7ddf5a27ccd1a51d73b649da7a"} Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.966498 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" Feb 18 11:59:05 crc kubenswrapper[4922]: I0218 11:59:05.972255 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dqtvp" event={"ID":"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc","Type":"ContainerStarted","Data":"55f5fe7cc5fea944498467b4885f7a182e7fdf03f39f9a7436de1aa8836d34cf"} Feb 18 11:59:06 crc kubenswrapper[4922]: I0218 11:59:06.003879 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-pc7hm"] Feb 18 11:59:06 crc kubenswrapper[4922]: I0218 11:59:06.012998 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-pc7hm"] Feb 18 11:59:06 crc kubenswrapper[4922]: I0218 11:59:06.986294 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" path="/var/lib/kubelet/pods/5fe07f11-0f10-4aa7-ab94-51d42b7a6367/volumes" Feb 18 11:59:06 crc kubenswrapper[4922]: I0218 11:59:06.988797 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dqtvp" event={"ID":"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc","Type":"ContainerStarted","Data":"fe9e0b8c506169c8ff2962eae62f59ba2039b683aba9a221a651171f7491288f"} Feb 18 11:59:07 crc kubenswrapper[4922]: I0218 11:59:07.005744 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dqtvp" podStartSLOduration=3.005724069 podStartE2EDuration="3.005724069s" podCreationTimestamp="2026-02-18 11:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:59:07.003755859 +0000 UTC m=+1348.731459959" watchObservedRunningTime="2026-02-18 11:59:07.005724069 +0000 UTC m=+1348.733428149" Feb 18 11:59:09 crc kubenswrapper[4922]: I0218 11:59:09.009711 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerStarted","Data":"230a0bc54b8dd07f66eb86ad2215c814c3b880945116c4dab8556a97fba068aa"} Feb 18 11:59:09 crc kubenswrapper[4922]: I0218 11:59:09.010224 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 11:59:09 crc kubenswrapper[4922]: I0218 11:59:09.037964 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.294086494 podStartE2EDuration="9.037864193s" podCreationTimestamp="2026-02-18 11:59:00 +0000 UTC" firstStartedPulling="2026-02-18 11:59:02.028236269 +0000 UTC m=+1343.755940349" lastFinishedPulling="2026-02-18 11:59:07.772013968 +0000 UTC m=+1349.499718048" observedRunningTime="2026-02-18 11:59:09.028343462 +0000 UTC m=+1350.756047532" watchObservedRunningTime="2026-02-18 11:59:09.037864193 +0000 UTC m=+1350.765568273" Feb 18 11:59:09 crc kubenswrapper[4922]: I0218 11:59:09.755670 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bccf8f775-pc7hm" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.209:5353: i/o timeout" Feb 18 11:59:10 crc kubenswrapper[4922]: I0218 11:59:10.261660 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 11:59:10 crc kubenswrapper[4922]: I0218 11:59:10.261969 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 11:59:11 crc kubenswrapper[4922]: I0218 11:59:11.277651 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:11 crc kubenswrapper[4922]: I0218 11:59:11.277713 4922 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:12 crc kubenswrapper[4922]: I0218 11:59:12.036537 4922 generic.go:334] "Generic (PLEG): container finished" podID="b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" containerID="fe9e0b8c506169c8ff2962eae62f59ba2039b683aba9a221a651171f7491288f" exitCode=0 Feb 18 11:59:12 crc kubenswrapper[4922]: I0218 11:59:12.036600 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dqtvp" event={"ID":"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc","Type":"ContainerDied","Data":"fe9e0b8c506169c8ff2962eae62f59ba2039b683aba9a221a651171f7491288f"} Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.430920 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dqtvp" Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.499311 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb558\" (UniqueName: \"kubernetes.io/projected/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-kube-api-access-fb558\") pod \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.499684 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-combined-ca-bundle\") pod \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.499814 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-config-data\") pod \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.499849 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-scripts\") pod \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\" (UID: \"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc\") " Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.505335 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-scripts" (OuterVolumeSpecName: "scripts") pod "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" (UID: "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.505793 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-kube-api-access-fb558" (OuterVolumeSpecName: "kube-api-access-fb558") pod "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" (UID: "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc"). InnerVolumeSpecName "kube-api-access-fb558". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.532461 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-config-data" (OuterVolumeSpecName: "config-data") pod "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" (UID: "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.547244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" (UID: "b3440f4a-2f1d-4d69-aafe-ec2eb86183cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.602352 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.602640 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.602653 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:13 crc kubenswrapper[4922]: I0218 11:59:13.602665 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb558\" (UniqueName: \"kubernetes.io/projected/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc-kube-api-access-fb558\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.056289 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dqtvp" event={"ID":"b3440f4a-2f1d-4d69-aafe-ec2eb86183cc","Type":"ContainerDied","Data":"55f5fe7cc5fea944498467b4885f7a182e7fdf03f39f9a7436de1aa8836d34cf"}
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.056598 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f5fe7cc5fea944498467b4885f7a182e7fdf03f39f9a7436de1aa8836d34cf"
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.056320 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dqtvp"
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.233121 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.233633 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-log" containerID="cri-o://43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422" gracePeriod=30
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.233746 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-api" containerID="cri-o://1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94" gracePeriod=30
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.245632 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.245891 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fe95484c-ea5d-4ea3-8915-bb6734014373" containerName="nova-scheduler-scheduler" containerID="cri-o://8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5" gracePeriod=30
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.313109 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.313333 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-log" containerID="cri-o://b08e25eab8acf946afba995372397a3ab97e8a1f62eebeb6161f0fd5696132b0" gracePeriod=30
Feb 18 11:59:14 crc kubenswrapper[4922]: I0218 11:59:14.313486 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-metadata" containerID="cri-o://6eb028dcb7cb22f62f9b70fd21469cacb245f308ad0c1ab142649395ab9a2db8" gracePeriod=30
Feb 18 11:59:15 crc kubenswrapper[4922]: I0218 11:59:15.066615 4922 generic.go:334] "Generic (PLEG): container finished" podID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerID="b08e25eab8acf946afba995372397a3ab97e8a1f62eebeb6161f0fd5696132b0" exitCode=143
Feb 18 11:59:15 crc kubenswrapper[4922]: I0218 11:59:15.066686 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a54dd7-a74b-49c4-a631-ad8fe2c22d58","Type":"ContainerDied","Data":"b08e25eab8acf946afba995372397a3ab97e8a1f62eebeb6161f0fd5696132b0"}
Feb 18 11:59:15 crc kubenswrapper[4922]: I0218 11:59:15.068405 4922 generic.go:334] "Generic (PLEG): container finished" podID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerID="43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422" exitCode=143
Feb 18 11:59:15 crc kubenswrapper[4922]: I0218 11:59:15.068433 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c0a029b-ba40-494a-b439-5ddf2073ad00","Type":"ContainerDied","Data":"43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422"}
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.017057 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.088062 4922 generic.go:334] "Generic (PLEG): container finished" podID="fe95484c-ea5d-4ea3-8915-bb6734014373" containerID="8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5" exitCode=0
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.088111 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe95484c-ea5d-4ea3-8915-bb6734014373","Type":"ContainerDied","Data":"8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5"}
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.088120 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.088142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe95484c-ea5d-4ea3-8915-bb6734014373","Type":"ContainerDied","Data":"bde417a0228740daf9a9d49c2dd39d718faec4df17eddbb3036309edba429d67"}
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.088163 4922 scope.go:117] "RemoveContainer" containerID="8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.100261 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-config-data\") pod \"fe95484c-ea5d-4ea3-8915-bb6734014373\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") "
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.100479 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-combined-ca-bundle\") pod \"fe95484c-ea5d-4ea3-8915-bb6734014373\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") "
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.100540 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcwtr\" (UniqueName: \"kubernetes.io/projected/fe95484c-ea5d-4ea3-8915-bb6734014373-kube-api-access-bcwtr\") pod \"fe95484c-ea5d-4ea3-8915-bb6734014373\" (UID: \"fe95484c-ea5d-4ea3-8915-bb6734014373\") "
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.105961 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe95484c-ea5d-4ea3-8915-bb6734014373-kube-api-access-bcwtr" (OuterVolumeSpecName: "kube-api-access-bcwtr") pod "fe95484c-ea5d-4ea3-8915-bb6734014373" (UID: "fe95484c-ea5d-4ea3-8915-bb6734014373"). InnerVolumeSpecName "kube-api-access-bcwtr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.114662 4922 scope.go:117] "RemoveContainer" containerID="8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5"
Feb 18 11:59:17 crc kubenswrapper[4922]: E0218 11:59:17.115611 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5\": container with ID starting with 8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5 not found: ID does not exist" containerID="8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.115701 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5"} err="failed to get container status \"8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5\": rpc error: code = NotFound desc = could not find container \"8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5\": container with ID starting with 8d90a705ba699c8182d6d1e0ea35773ad23d4e3cf41e4c0e22f8061e6e782bf5 not found: ID does not exist"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.127579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe95484c-ea5d-4ea3-8915-bb6734014373" (UID: "fe95484c-ea5d-4ea3-8915-bb6734014373"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.137152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-config-data" (OuterVolumeSpecName: "config-data") pod "fe95484c-ea5d-4ea3-8915-bb6734014373" (UID: "fe95484c-ea5d-4ea3-8915-bb6734014373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.202318 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcwtr\" (UniqueName: \"kubernetes.io/projected/fe95484c-ea5d-4ea3-8915-bb6734014373-kube-api-access-bcwtr\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.202378 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.202388 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe95484c-ea5d-4ea3-8915-bb6734014373-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.436015 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.450840 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.465415 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:59:17 crc kubenswrapper[4922]: E0218 11:59:17.465872 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" containerName="nova-manage"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.465895 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" containerName="nova-manage"
Feb 18 11:59:17 crc kubenswrapper[4922]: E0218 11:59:17.465917 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="init"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.465924 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="init"
Feb 18 11:59:17 crc kubenswrapper[4922]: E0218 11:59:17.465935 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="dnsmasq-dns"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.465941 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="dnsmasq-dns"
Feb 18 11:59:17 crc kubenswrapper[4922]: E0218 11:59:17.465958 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe95484c-ea5d-4ea3-8915-bb6734014373" containerName="nova-scheduler-scheduler"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.465964 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe95484c-ea5d-4ea3-8915-bb6734014373" containerName="nova-scheduler-scheduler"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.466136 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe07f11-0f10-4aa7-ab94-51d42b7a6367" containerName="dnsmasq-dns"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.466151 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" containerName="nova-manage"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.466160 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe95484c-ea5d-4ea3-8915-bb6734014373" containerName="nova-scheduler-scheduler"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.466785 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.469337 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.486236 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.612098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.612157 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb47\" (UniqueName: \"kubernetes.io/projected/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-kube-api-access-fgb47\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:17 crc kubenswrapper[4922]: I0218 11:59:17.612300 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-config-data\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.716385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.716458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb47\" (UniqueName: \"kubernetes.io/projected/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-kube-api-access-fgb47\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.716612 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-config-data\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.732305 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-config-data\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.733440 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.739750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgb47\" (UniqueName: \"kubernetes.io/projected/7319f7de-4554-4a03-ba7f-c0f414ab2fe5-kube-api-access-fgb47\") pod \"nova-scheduler-0\" (UID: \"7319f7de-4554-4a03-ba7f-c0f414ab2fe5\") " pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.799530 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:17.940646 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.022485 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-config-data\") pod \"5c0a029b-ba40-494a-b439-5ddf2073ad00\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.022521 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0a029b-ba40-494a-b439-5ddf2073ad00-logs\") pod \"5c0a029b-ba40-494a-b439-5ddf2073ad00\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.022570 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-combined-ca-bundle\") pod \"5c0a029b-ba40-494a-b439-5ddf2073ad00\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.022592 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-public-tls-certs\") pod \"5c0a029b-ba40-494a-b439-5ddf2073ad00\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.022645 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn58b\" (UniqueName: \"kubernetes.io/projected/5c0a029b-ba40-494a-b439-5ddf2073ad00-kube-api-access-wn58b\") pod \"5c0a029b-ba40-494a-b439-5ddf2073ad00\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.022829 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-internal-tls-certs\") pod \"5c0a029b-ba40-494a-b439-5ddf2073ad00\" (UID: \"5c0a029b-ba40-494a-b439-5ddf2073ad00\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.023688 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0a029b-ba40-494a-b439-5ddf2073ad00-logs" (OuterVolumeSpecName: "logs") pod "5c0a029b-ba40-494a-b439-5ddf2073ad00" (UID: "5c0a029b-ba40-494a-b439-5ddf2073ad00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.027093 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0a029b-ba40-494a-b439-5ddf2073ad00-kube-api-access-wn58b" (OuterVolumeSpecName: "kube-api-access-wn58b") pod "5c0a029b-ba40-494a-b439-5ddf2073ad00" (UID: "5c0a029b-ba40-494a-b439-5ddf2073ad00"). InnerVolumeSpecName "kube-api-access-wn58b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.050540 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-config-data" (OuterVolumeSpecName: "config-data") pod "5c0a029b-ba40-494a-b439-5ddf2073ad00" (UID: "5c0a029b-ba40-494a-b439-5ddf2073ad00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.069452 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c0a029b-ba40-494a-b439-5ddf2073ad00" (UID: "5c0a029b-ba40-494a-b439-5ddf2073ad00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.077898 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c0a029b-ba40-494a-b439-5ddf2073ad00" (UID: "5c0a029b-ba40-494a-b439-5ddf2073ad00"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.089614 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c0a029b-ba40-494a-b439-5ddf2073ad00" (UID: "5c0a029b-ba40-494a-b439-5ddf2073ad00"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.102038 4922 generic.go:334] "Generic (PLEG): container finished" podID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerID="1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94" exitCode=0
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.102230 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c0a029b-ba40-494a-b439-5ddf2073ad00","Type":"ContainerDied","Data":"1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94"}
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.102269 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c0a029b-ba40-494a-b439-5ddf2073ad00","Type":"ContainerDied","Data":"1e23b924bddad76596057cd410450387011174217e8c9d8f3144d75f72013eac"}
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.102298 4922 scope.go:117] "RemoveContainer" containerID="1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.102300 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.105831 4922 generic.go:334] "Generic (PLEG): container finished" podID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerID="6eb028dcb7cb22f62f9b70fd21469cacb245f308ad0c1ab142649395ab9a2db8" exitCode=0
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.105903 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a54dd7-a74b-49c4-a631-ad8fe2c22d58","Type":"ContainerDied","Data":"6eb028dcb7cb22f62f9b70fd21469cacb245f308ad0c1ab142649395ab9a2db8"}
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.125735 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.125777 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.125789 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0a029b-ba40-494a-b439-5ddf2073ad00-logs\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.125804 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.125814 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c0a029b-ba40-494a-b439-5ddf2073ad00-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.125826 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn58b\" (UniqueName: \"kubernetes.io/projected/5c0a029b-ba40-494a-b439-5ddf2073ad00-kube-api-access-wn58b\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.130672 4922 scope.go:117] "RemoveContainer" containerID="43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.142262 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.157822 4922 scope.go:117] "RemoveContainer" containerID="1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94"
Feb 18 11:59:18 crc kubenswrapper[4922]: E0218 11:59:18.158211 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94\": container with ID starting with 1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94 not found: ID does not exist" containerID="1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.158249 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94"} err="failed to get container status \"1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94\": rpc error: code = NotFound desc = could not find container \"1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94\": container with ID starting with 1ec2718d8714c27f5e2e142d0c677a6a64bedbb507cfb8ea954dbcb0e4cebc94 not found: ID does not exist"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.158276 4922 scope.go:117] "RemoveContainer" containerID="43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422"
Feb 18 11:59:18 crc kubenswrapper[4922]: E0218 11:59:18.159155 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422\": container with ID starting with 43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422 not found: ID does not exist" containerID="43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.159183 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422"} err="failed to get container status \"43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422\": rpc error: code = NotFound desc = could not find container \"43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422\": container with ID starting with 43904287d555f659fd122c3d980c18b3522ce068f02a16e4a164632620f31422 not found: ID does not exist"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.161959 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.197948 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:59:18 crc kubenswrapper[4922]: E0218 11:59:18.199004 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-api"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.199021 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-api"
Feb 18 11:59:18 crc kubenswrapper[4922]: E0218 11:59:18.199046 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-log"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.199053 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-log"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.199649 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-api"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.199687 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" containerName="nova-api-log"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.202181 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.206881 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.207540 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.213096 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.231882 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.329270 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-public-tls-certs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.329340 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.329409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-config-data\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.329464 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.329733 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b05385b6-6350-4ee0-b628-a1eb55dd6067-logs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.330004 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts9pr\" (UniqueName: \"kubernetes.io/projected/b05385b6-6350-4ee0-b628-a1eb55dd6067-kube-api-access-ts9pr\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.432114 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b05385b6-6350-4ee0-b628-a1eb55dd6067-logs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.432243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts9pr\" (UniqueName: \"kubernetes.io/projected/b05385b6-6350-4ee0-b628-a1eb55dd6067-kube-api-access-ts9pr\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.432278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-public-tls-certs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.432318 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.432414 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-config-data\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.432455 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.435909 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.435984 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b05385b6-6350-4ee0-b628-a1eb55dd6067-logs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.437592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.437734 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-public-tls-certs\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.448793 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05385b6-6350-4ee0-b628-a1eb55dd6067-config-data\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.452918 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts9pr\" (UniqueName: \"kubernetes.io/projected/b05385b6-6350-4ee0-b628-a1eb55dd6067-kube-api-access-ts9pr\") pod \"nova-api-0\" (UID: \"b05385b6-6350-4ee0-b628-a1eb55dd6067\") " pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.522636 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.853022 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.942153 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-config-data\") pod \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.942653 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm79t\" (UniqueName: \"kubernetes.io/projected/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-kube-api-access-zm79t\") pod \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.942819 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-logs\") pod \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.942944 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-combined-ca-bundle\") pod \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\" (UID: \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") "
Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.943004 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-nova-metadata-tls-certs\") pod \"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\" (UID:
\"10a54dd7-a74b-49c4-a631-ad8fe2c22d58\") " Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.943448 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-logs" (OuterVolumeSpecName: "logs") pod "10a54dd7-a74b-49c4-a631-ad8fe2c22d58" (UID: "10a54dd7-a74b-49c4-a631-ad8fe2c22d58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.943721 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-logs\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.947894 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-kube-api-access-zm79t" (OuterVolumeSpecName: "kube-api-access-zm79t") pod "10a54dd7-a74b-49c4-a631-ad8fe2c22d58" (UID: "10a54dd7-a74b-49c4-a631-ad8fe2c22d58"). InnerVolumeSpecName "kube-api-access-zm79t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.992124 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0a029b-ba40-494a-b439-5ddf2073ad00" path="/var/lib/kubelet/pods/5c0a029b-ba40-494a-b439-5ddf2073ad00/volumes" Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.993216 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe95484c-ea5d-4ea3-8915-bb6734014373" path="/var/lib/kubelet/pods/fe95484c-ea5d-4ea3-8915-bb6734014373/volumes" Feb 18 11:59:18 crc kubenswrapper[4922]: I0218 11:59:18.993289 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10a54dd7-a74b-49c4-a631-ad8fe2c22d58" (UID: "10a54dd7-a74b-49c4-a631-ad8fe2c22d58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.023402 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-config-data" (OuterVolumeSpecName: "config-data") pod "10a54dd7-a74b-49c4-a631-ad8fe2c22d58" (UID: "10a54dd7-a74b-49c4-a631-ad8fe2c22d58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.030721 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 11:59:19 crc kubenswrapper[4922]: W0218 11:59:19.040176 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7319f7de_4554_4a03_ba7f_c0f414ab2fe5.slice/crio-4ffc2f4c77e305fea5a9a0096fb24c1314b1b4c09e6bae68b1b5d4f18815a440 WatchSource:0}: Error finding container 4ffc2f4c77e305fea5a9a0096fb24c1314b1b4c09e6bae68b1b5d4f18815a440: Status 404 returned error can't find the container with id 4ffc2f4c77e305fea5a9a0096fb24c1314b1b4c09e6bae68b1b5d4f18815a440 Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.040833 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "10a54dd7-a74b-49c4-a631-ad8fe2c22d58" (UID: "10a54dd7-a74b-49c4-a631-ad8fe2c22d58"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.046467 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.046520 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.046540 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.046552 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm79t\" (UniqueName: \"kubernetes.io/projected/10a54dd7-a74b-49c4-a631-ad8fe2c22d58-kube-api-access-zm79t\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.099774 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 11:59:19 crc kubenswrapper[4922]: W0218 11:59:19.109640 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb05385b6_6350_4ee0_b628_a1eb55dd6067.slice/crio-cb44d789f0d429f2f75df1183a28527e1050f66ffa221b07c70145c221061862 WatchSource:0}: Error finding container cb44d789f0d429f2f75df1183a28527e1050f66ffa221b07c70145c221061862: Status 404 returned error can't find the container with id cb44d789f0d429f2f75df1183a28527e1050f66ffa221b07c70145c221061862 Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.116931 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"7319f7de-4554-4a03-ba7f-c0f414ab2fe5","Type":"ContainerStarted","Data":"4ffc2f4c77e305fea5a9a0096fb24c1314b1b4c09e6bae68b1b5d4f18815a440"} Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.121086 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10a54dd7-a74b-49c4-a631-ad8fe2c22d58","Type":"ContainerDied","Data":"0eb58513d2b717b0e4ac58c75a07be6926255b4cc98ac974a0fd2d38b35e5a01"} Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.121133 4922 scope.go:117] "RemoveContainer" containerID="6eb028dcb7cb22f62f9b70fd21469cacb245f308ad0c1ab142649395ab9a2db8" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.121254 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.158880 4922 scope.go:117] "RemoveContainer" containerID="b08e25eab8acf946afba995372397a3ab97e8a1f62eebeb6161f0fd5696132b0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.195638 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.228559 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.249630 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:59:19 crc kubenswrapper[4922]: E0218 11:59:19.250158 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-metadata" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.250170 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-metadata" Feb 18 11:59:19 crc kubenswrapper[4922]: E0218 11:59:19.250198 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-log" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.250203 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-log" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.250410 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-metadata" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.250435 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-log" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.252406 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.257260 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.257769 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.263479 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.354824 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.354877 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.354899 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-logs\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.354930 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9h8\" (UniqueName: \"kubernetes.io/projected/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-kube-api-access-6w9h8\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.354988 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-config-data\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.456647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.456692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.456724 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-logs\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.456759 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w9h8\" (UniqueName: \"kubernetes.io/projected/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-kube-api-access-6w9h8\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.456819 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-config-data\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.457588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-logs\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.461257 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-config-data\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.461352 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.461712 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.475278 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9h8\" (UniqueName: \"kubernetes.io/projected/3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6-kube-api-access-6w9h8\") pod \"nova-metadata-0\" (UID: \"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6\") " pod="openstack/nova-metadata-0" Feb 18 11:59:19 crc kubenswrapper[4922]: I0218 11:59:19.576973 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.105051 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.145998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b05385b6-6350-4ee0-b628-a1eb55dd6067","Type":"ContainerStarted","Data":"df28e17fccddfa92c1f512f2122f395beb47b3a065c4077ad5cd820115214663"} Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.146169 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b05385b6-6350-4ee0-b628-a1eb55dd6067","Type":"ContainerStarted","Data":"1beb1ab0db9b065ee6bfe84046297aa32e576317f296797a14ba2400f76e47cb"} Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.146280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b05385b6-6350-4ee0-b628-a1eb55dd6067","Type":"ContainerStarted","Data":"cb44d789f0d429f2f75df1183a28527e1050f66ffa221b07c70145c221061862"} Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.175988 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7319f7de-4554-4a03-ba7f-c0f414ab2fe5","Type":"ContainerStarted","Data":"20cc532910fcb241e32cbde76cb1ec428f2c79cb4dc17c5a1555717b24a957b1"} Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.199725 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6","Type":"ContainerStarted","Data":"ac289d5ab2fba79d8d93271e7c9b0948061fceda0f37a1c54e50187cd24f7843"} Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.221465 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.221447782 podStartE2EDuration="2.221447782s" podCreationTimestamp="2026-02-18 11:59:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:59:20.220063526 +0000 UTC m=+1361.947767606" watchObservedRunningTime="2026-02-18 11:59:20.221447782 +0000 UTC m=+1361.949151862" Feb 18 11:59:20 crc kubenswrapper[4922]: I0218 11:59:20.253976 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.253957476 podStartE2EDuration="3.253957476s" podCreationTimestamp="2026-02-18 11:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:59:20.246905887 +0000 UTC m=+1361.974609967" watchObservedRunningTime="2026-02-18 11:59:20.253957476 +0000 UTC m=+1361.981661556" Feb 18 11:59:21 crc kubenswrapper[4922]: I0218 11:59:21.013603 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" path="/var/lib/kubelet/pods/10a54dd7-a74b-49c4-a631-ad8fe2c22d58/volumes" Feb 18 11:59:21 crc kubenswrapper[4922]: I0218 11:59:21.222856 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6","Type":"ContainerStarted","Data":"c1ba3779a1cb1ede99b0f035434b810f749eee065544b58c9396282255ad7ea7"} Feb 18 11:59:21 crc kubenswrapper[4922]: I0218 11:59:21.222910 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6","Type":"ContainerStarted","Data":"b3ff55e981f818d10524f9732c0dcc9a693963171f691d48b912e51dffe1d8bc"} Feb 18 11:59:21 crc kubenswrapper[4922]: I0218 11:59:21.253591 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.253568714 podStartE2EDuration="2.253568714s" podCreationTimestamp="2026-02-18 11:59:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:59:21.238551783 +0000 UTC m=+1362.966255863" watchObservedRunningTime="2026-02-18 11:59:21.253568714 +0000 UTC m=+1362.981272804" Feb 18 11:59:22 crc kubenswrapper[4922]: I0218 11:59:22.799788 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 11:59:23 crc kubenswrapper[4922]: I0218 11:59:23.071410 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:23 crc kubenswrapper[4922]: I0218 11:59:23.071433 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="10a54dd7-a74b-49c4-a631-ad8fe2c22d58" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:24 crc kubenswrapper[4922]: I0218 11:59:24.577649 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 11:59:24 crc kubenswrapper[4922]: I0218 11:59:24.577971 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 11:59:27 crc kubenswrapper[4922]: I0218 11:59:27.799959 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 11:59:27 crc kubenswrapper[4922]: I0218 11:59:27.836172 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 11:59:28 crc kubenswrapper[4922]: I0218 11:59:28.325909 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 11:59:28 crc kubenswrapper[4922]: I0218 11:59:28.523074 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 11:59:28 crc kubenswrapper[4922]: I0218 11:59:28.523421 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 11:59:29 crc kubenswrapper[4922]: I0218 11:59:29.536536 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b05385b6-6350-4ee0-b628-a1eb55dd6067" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:29 crc kubenswrapper[4922]: I0218 11:59:29.536572 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b05385b6-6350-4ee0-b628-a1eb55dd6067" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:29 crc kubenswrapper[4922]: I0218 11:59:29.577477 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 11:59:29 crc kubenswrapper[4922]: I0218 11:59:29.578938 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 11:59:30 crc kubenswrapper[4922]: I0218 11:59:30.588610 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:30 crc kubenswrapper[4922]: I0218 11:59:30.588623 4922 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 11:59:31 crc kubenswrapper[4922]: I0218 11:59:31.484278 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.079600 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.080183 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2aa305a0-c015-43c2-851c-8eff778238be" containerName="kube-state-metrics" containerID="cri-o://c04aa8b95a26312fcc998c31a3b946f55c4ecd9e671bf072302577040fa7d761" gracePeriod=30 Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.362185 4922 generic.go:334] "Generic (PLEG): container finished" podID="2aa305a0-c015-43c2-851c-8eff778238be" containerID="c04aa8b95a26312fcc998c31a3b946f55c4ecd9e671bf072302577040fa7d761" exitCode=2 Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.362299 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aa305a0-c015-43c2-851c-8eff778238be","Type":"ContainerDied","Data":"c04aa8b95a26312fcc998c31a3b946f55c4ecd9e671bf072302577040fa7d761"} Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.602528 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.682782 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhpfc\" (UniqueName: \"kubernetes.io/projected/2aa305a0-c015-43c2-851c-8eff778238be-kube-api-access-hhpfc\") pod \"2aa305a0-c015-43c2-851c-8eff778238be\" (UID: \"2aa305a0-c015-43c2-851c-8eff778238be\") " Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.689345 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa305a0-c015-43c2-851c-8eff778238be-kube-api-access-hhpfc" (OuterVolumeSpecName: "kube-api-access-hhpfc") pod "2aa305a0-c015-43c2-851c-8eff778238be" (UID: "2aa305a0-c015-43c2-851c-8eff778238be"). InnerVolumeSpecName "kube-api-access-hhpfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:59:35 crc kubenswrapper[4922]: I0218 11:59:35.785423 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhpfc\" (UniqueName: \"kubernetes.io/projected/2aa305a0-c015-43c2-851c-8eff778238be-kube-api-access-hhpfc\") on node \"crc\" DevicePath \"\"" Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.377100 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aa305a0-c015-43c2-851c-8eff778238be","Type":"ContainerDied","Data":"3a3d098eed640f36965fdadc7b1dd0c83929950b22d8057eb96a4ca71c50bd14"} Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.377152 4922 scope.go:117] "RemoveContainer" containerID="c04aa8b95a26312fcc998c31a3b946f55c4ecd9e671bf072302577040fa7d761" Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.378391 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.415133 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.423572 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.442493 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 11:59:36 crc kubenswrapper[4922]: E0218 11:59:36.442868 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa305a0-c015-43c2-851c-8eff778238be" containerName="kube-state-metrics" Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.442884 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa305a0-c015-43c2-851c-8eff778238be" containerName="kube-state-metrics" Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.443081 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa305a0-c015-43c2-851c-8eff778238be" containerName="kube-state-metrics" Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.443692 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.445553 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.447702 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.476089 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.498541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.498617 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.498671 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.498764 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8pc\" (UniqueName: \"kubernetes.io/projected/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-api-access-vn8pc\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.601409 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.601527 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.601589 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.601753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8pc\" (UniqueName: \"kubernetes.io/projected/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-api-access-vn8pc\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.610266 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.618883 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.629749 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.634191 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8pc\" (UniqueName: \"kubernetes.io/projected/1b492a6f-c8fc-4a76-8645-9f94a29d5e6b-kube-api-access-vn8pc\") pod \"kube-state-metrics-0\" (UID: \"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b\") " pod="openstack/kube-state-metrics-0"
Feb 18 11:59:36 crc kubenswrapper[4922]: I0218 11:59:36.780030 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.034317 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa305a0-c015-43c2-851c-8eff778238be" path="/var/lib/kubelet/pods/2aa305a0-c015-43c2-851c-8eff778238be/volumes"
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.277324 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.278019 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-central-agent" containerID="cri-o://a917c6100631e0cecbb43e73dcdd3017b9ee5fcde8f4b2eb40ffa1cf7a6ddd61" gracePeriod=30
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.278038 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="proxy-httpd" containerID="cri-o://230a0bc54b8dd07f66eb86ad2215c814c3b880945116c4dab8556a97fba068aa" gracePeriod=30
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.278094 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="sg-core" containerID="cri-o://4e1f36256af229f8391ab980166248080c950f7ddf5a27ccd1a51d73b649da7a" gracePeriod=30
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.278180 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-notification-agent" containerID="cri-o://c0b1ec47222cf49451bd1b033a64e0ee80b9aa5aed82b2529850f17df47b4532" gracePeriod=30
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.297005 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 11:59:37 crc kubenswrapper[4922]: W0218 11:59:37.299631 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b492a6f_c8fc_4a76_8645_9f94a29d5e6b.slice/crio-73bd1d1d00a22d2a791aab7c6ffd7b5d9ff1ba7174a16f36465d2da663434a86 WatchSource:0}: Error finding container 73bd1d1d00a22d2a791aab7c6ffd7b5d9ff1ba7174a16f36465d2da663434a86: Status 404 returned error can't find the container with id 73bd1d1d00a22d2a791aab7c6ffd7b5d9ff1ba7174a16f36465d2da663434a86
Feb 18 11:59:37 crc kubenswrapper[4922]: I0218 11:59:37.391329 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b","Type":"ContainerStarted","Data":"73bd1d1d00a22d2a791aab7c6ffd7b5d9ff1ba7174a16f36465d2da663434a86"}
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.405149 4922 generic.go:334] "Generic (PLEG): container finished" podID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerID="230a0bc54b8dd07f66eb86ad2215c814c3b880945116c4dab8556a97fba068aa" exitCode=0
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.405483 4922 generic.go:334] "Generic (PLEG): container finished" podID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerID="4e1f36256af229f8391ab980166248080c950f7ddf5a27ccd1a51d73b649da7a" exitCode=2
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.405498 4922 generic.go:334] "Generic (PLEG): container finished" podID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerID="a917c6100631e0cecbb43e73dcdd3017b9ee5fcde8f4b2eb40ffa1cf7a6ddd61" exitCode=0
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.405249 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerDied","Data":"230a0bc54b8dd07f66eb86ad2215c814c3b880945116c4dab8556a97fba068aa"}
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.405544 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerDied","Data":"4e1f36256af229f8391ab980166248080c950f7ddf5a27ccd1a51d73b649da7a"}
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.405570 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerDied","Data":"a917c6100631e0cecbb43e73dcdd3017b9ee5fcde8f4b2eb40ffa1cf7a6ddd61"}
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.532209 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.532643 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.542860 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 18 11:59:38 crc kubenswrapper[4922]: I0218 11:59:38.546604 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.418226 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b492a6f-c8fc-4a76-8645-9f94a29d5e6b","Type":"ContainerStarted","Data":"ab9c779636aeab10119c3cbb2558f0699003e506990c48dfde5601fb4b98b651"}
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.418517 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.419220 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.438140 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.443146 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.948916229 podStartE2EDuration="3.443123388s" podCreationTimestamp="2026-02-18 11:59:36 +0000 UTC" firstStartedPulling="2026-02-18 11:59:37.301969064 +0000 UTC m=+1379.029673144" lastFinishedPulling="2026-02-18 11:59:38.796176223 +0000 UTC m=+1380.523880303" observedRunningTime="2026-02-18 11:59:39.432041847 +0000 UTC m=+1381.159745927" watchObservedRunningTime="2026-02-18 11:59:39.443123388 +0000 UTC m=+1381.170827468"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.588986 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.594799 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 18 11:59:39 crc kubenswrapper[4922]: I0218 11:59:39.596814 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.432137 4922 generic.go:334] "Generic (PLEG): container finished" podID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerID="c0b1ec47222cf49451bd1b033a64e0ee80b9aa5aed82b2529850f17df47b4532" exitCode=0
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.432189 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerDied","Data":"c0b1ec47222cf49451bd1b033a64e0ee80b9aa5aed82b2529850f17df47b4532"}
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.438689 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.799800 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.807960 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-log-httpd\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.808014 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-config-data\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.808071 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-run-httpd\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.808140 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-sg-core-conf-yaml\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.808172 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h25l\" (UniqueName: \"kubernetes.io/projected/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-kube-api-access-8h25l\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.808227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-combined-ca-bundle\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.808281 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-scripts\") pod \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\" (UID: \"e3b8f165-b92e-47d4-ada4-5eee351d6a5a\") "
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.809531 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.809848 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.815789 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-kube-api-access-8h25l" (OuterVolumeSpecName: "kube-api-access-8h25l") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "kube-api-access-8h25l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.826530 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-scripts" (OuterVolumeSpecName: "scripts") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.887174 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.911630 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.911667 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.911681 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.911695 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h25l\" (UniqueName: \"kubernetes.io/projected/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-kube-api-access-8h25l\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.911706 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.934708 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:40 crc kubenswrapper[4922]: I0218 11:59:40.972683 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-config-data" (OuterVolumeSpecName: "config-data") pod "e3b8f165-b92e-47d4-ada4-5eee351d6a5a" (UID: "e3b8f165-b92e-47d4-ada4-5eee351d6a5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.014051 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.014089 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b8f165-b92e-47d4-ada4-5eee351d6a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.443704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3b8f165-b92e-47d4-ada4-5eee351d6a5a","Type":"ContainerDied","Data":"f48c4f106861c2b0fbbd89f3ec527e011b85b43938cee7887c256abc61904ff7"}
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.443749 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.443767 4922 scope.go:117] "RemoveContainer" containerID="230a0bc54b8dd07f66eb86ad2215c814c3b880945116c4dab8556a97fba068aa"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.466566 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.485672 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.489518 4922 scope.go:117] "RemoveContainer" containerID="4e1f36256af229f8391ab980166248080c950f7ddf5a27ccd1a51d73b649da7a"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503154 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:59:41 crc kubenswrapper[4922]: E0218 11:59:41.503668 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-central-agent"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503689 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-central-agent"
Feb 18 11:59:41 crc kubenswrapper[4922]: E0218 11:59:41.503704 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="proxy-httpd"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503710 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="proxy-httpd"
Feb 18 11:59:41 crc kubenswrapper[4922]: E0218 11:59:41.503734 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="sg-core"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503742 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="sg-core"
Feb 18 11:59:41 crc kubenswrapper[4922]: E0218 11:59:41.503767 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-notification-agent"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503773 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-notification-agent"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503972 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-notification-agent"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.503991 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="ceilometer-central-agent"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.504000 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="sg-core"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.504017 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" containerName="proxy-httpd"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.506164 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.509655 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.509986 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.511793 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.514840 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.515034 4922 scope.go:117] "RemoveContainer" containerID="c0b1ec47222cf49451bd1b033a64e0ee80b9aa5aed82b2529850f17df47b4532"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521656 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521697 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-config-data\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521744 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521771 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw462\" (UniqueName: \"kubernetes.io/projected/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-kube-api-access-tw462\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521872 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521932 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-scripts\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.521983 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.546557 4922 scope.go:117] "RemoveContainer" containerID="a917c6100631e0cecbb43e73dcdd3017b9ee5fcde8f4b2eb40ffa1cf7a6ddd61"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623207 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw462\" (UniqueName: \"kubernetes.io/projected/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-kube-api-access-tw462\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623282 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623312 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-scripts\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623341 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623401 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623653 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.623694 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-config-data\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.624044 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.624389 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.628893 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.630426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.634687 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-config-data\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.636634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-scripts\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.645345 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw462\" (UniqueName: \"kubernetes.io/projected/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-kube-api-access-tw462\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.659604 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc3cdcf-4513-4e18-8d43-c435fd877ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfc3cdcf-4513-4e18-8d43-c435fd877ae7\") " pod="openstack/ceilometer-0"
Feb 18 11:59:41 crc kubenswrapper[4922]: I0218 11:59:41.820925 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 11:59:42 crc kubenswrapper[4922]: W0218 11:59:42.322022 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfc3cdcf_4513_4e18_8d43_c435fd877ae7.slice/crio-3fd39719d8855e2f91756da051f1d588a70b474048d8acaa977554ff1f3d9420 WatchSource:0}: Error finding container 3fd39719d8855e2f91756da051f1d588a70b474048d8acaa977554ff1f3d9420: Status 404 returned error can't find the container with id 3fd39719d8855e2f91756da051f1d588a70b474048d8acaa977554ff1f3d9420
Feb 18 11:59:42 crc kubenswrapper[4922]: I0218 11:59:42.322379 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 11:59:42 crc kubenswrapper[4922]: I0218 11:59:42.458426 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc3cdcf-4513-4e18-8d43-c435fd877ae7","Type":"ContainerStarted","Data":"3fd39719d8855e2f91756da051f1d588a70b474048d8acaa977554ff1f3d9420"}
Feb 18 11:59:42 crc kubenswrapper[4922]: I0218 11:59:42.993464 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b8f165-b92e-47d4-ada4-5eee351d6a5a" path="/var/lib/kubelet/pods/e3b8f165-b92e-47d4-ada4-5eee351d6a5a/volumes"
Feb 18 11:59:43 crc kubenswrapper[4922]: I0218 11:59:43.469991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc3cdcf-4513-4e18-8d43-c435fd877ae7","Type":"ContainerStarted","Data":"d190cb0679446d5880e640741f917ecaff38dfe1ba9ef2c8f4d95c17496508e6"}
Feb 18 11:59:45 crc kubenswrapper[4922]: I0218 11:59:45.491205 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"bfc3cdcf-4513-4e18-8d43-c435fd877ae7","Type":"ContainerStarted","Data":"14ab6968b09de88f6b954ec95726553215462b06544acf3a2c7829a7ad204e89"} Feb 18 11:59:46 crc kubenswrapper[4922]: I0218 11:59:46.797385 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 18 11:59:47 crc kubenswrapper[4922]: I0218 11:59:47.513814 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc3cdcf-4513-4e18-8d43-c435fd877ae7","Type":"ContainerStarted","Data":"dda346ed1aec44db04feee05d66d67fced8a78e5b16d2764dd2e01b3cd7eab87"} Feb 18 11:59:50 crc kubenswrapper[4922]: I0218 11:59:50.553101 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc3cdcf-4513-4e18-8d43-c435fd877ae7","Type":"ContainerStarted","Data":"b3ac14f8c3f4f10ff1101938d96e0428be00ec2f23a4dc040a5a0dfd7a403a26"} Feb 18 11:59:50 crc kubenswrapper[4922]: I0218 11:59:50.553701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 11:59:50 crc kubenswrapper[4922]: I0218 11:59:50.573122 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.265204988 podStartE2EDuration="9.573100568s" podCreationTimestamp="2026-02-18 11:59:41 +0000 UTC" firstStartedPulling="2026-02-18 11:59:42.324222986 +0000 UTC m=+1384.051927066" lastFinishedPulling="2026-02-18 11:59:49.632118566 +0000 UTC m=+1391.359822646" observedRunningTime="2026-02-18 11:59:50.572923423 +0000 UTC m=+1392.300627523" watchObservedRunningTime="2026-02-18 11:59:50.573100568 +0000 UTC m=+1392.300804648" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.154212 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97"] Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.156497 4922 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.158255 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.166524 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.169471 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97"] Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.297690 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee2dabc9-c094-41a8-8efd-7b113f5c634c-config-volume\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.297801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee2dabc9-c094-41a8-8efd-7b113f5c634c-secret-volume\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.298195 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgltn\" (UniqueName: \"kubernetes.io/projected/ee2dabc9-c094-41a8-8efd-7b113f5c634c-kube-api-access-fgltn\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.401135 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgltn\" (UniqueName: \"kubernetes.io/projected/ee2dabc9-c094-41a8-8efd-7b113f5c634c-kube-api-access-fgltn\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.401324 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee2dabc9-c094-41a8-8efd-7b113f5c634c-config-volume\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.401355 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee2dabc9-c094-41a8-8efd-7b113f5c634c-secret-volume\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.402383 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee2dabc9-c094-41a8-8efd-7b113f5c634c-config-volume\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.409696 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ee2dabc9-c094-41a8-8efd-7b113f5c634c-secret-volume\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.435569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgltn\" (UniqueName: \"kubernetes.io/projected/ee2dabc9-c094-41a8-8efd-7b113f5c634c-kube-api-access-fgltn\") pod \"collect-profiles-29523600-9hg97\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.483157 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:00 crc kubenswrapper[4922]: I0218 12:00:00.954012 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97"] Feb 18 12:00:01 crc kubenswrapper[4922]: I0218 12:00:01.654207 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee2dabc9-c094-41a8-8efd-7b113f5c634c" containerID="8bef9aa4b92aba91322be4b15768a495bfe0d2b031bccfdb47f0999ccd8a7508" exitCode=0 Feb 18 12:00:01 crc kubenswrapper[4922]: I0218 12:00:01.654254 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" event={"ID":"ee2dabc9-c094-41a8-8efd-7b113f5c634c","Type":"ContainerDied","Data":"8bef9aa4b92aba91322be4b15768a495bfe0d2b031bccfdb47f0999ccd8a7508"} Feb 18 12:00:01 crc kubenswrapper[4922]: I0218 12:00:01.655364 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" 
event={"ID":"ee2dabc9-c094-41a8-8efd-7b113f5c634c","Type":"ContainerStarted","Data":"79b3ffcfd2518bc6578364c9c8a0c4c14abd73b2e873e3bed532e4d07863d8c0"} Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.048912 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.156500 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee2dabc9-c094-41a8-8efd-7b113f5c634c-config-volume\") pod \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.156829 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgltn\" (UniqueName: \"kubernetes.io/projected/ee2dabc9-c094-41a8-8efd-7b113f5c634c-kube-api-access-fgltn\") pod \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.157289 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee2dabc9-c094-41a8-8efd-7b113f5c634c-secret-volume\") pod \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\" (UID: \"ee2dabc9-c094-41a8-8efd-7b113f5c634c\") " Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.157336 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2dabc9-c094-41a8-8efd-7b113f5c634c-config-volume" (OuterVolumeSpecName: "config-volume") pod "ee2dabc9-c094-41a8-8efd-7b113f5c634c" (UID: "ee2dabc9-c094-41a8-8efd-7b113f5c634c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.159311 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee2dabc9-c094-41a8-8efd-7b113f5c634c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.162834 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2dabc9-c094-41a8-8efd-7b113f5c634c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ee2dabc9-c094-41a8-8efd-7b113f5c634c" (UID: "ee2dabc9-c094-41a8-8efd-7b113f5c634c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.162841 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2dabc9-c094-41a8-8efd-7b113f5c634c-kube-api-access-fgltn" (OuterVolumeSpecName: "kube-api-access-fgltn") pod "ee2dabc9-c094-41a8-8efd-7b113f5c634c" (UID: "ee2dabc9-c094-41a8-8efd-7b113f5c634c"). InnerVolumeSpecName "kube-api-access-fgltn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.260914 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgltn\" (UniqueName: \"kubernetes.io/projected/ee2dabc9-c094-41a8-8efd-7b113f5c634c-kube-api-access-fgltn\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.260965 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee2dabc9-c094-41a8-8efd-7b113f5c634c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.672116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" event={"ID":"ee2dabc9-c094-41a8-8efd-7b113f5c634c","Type":"ContainerDied","Data":"79b3ffcfd2518bc6578364c9c8a0c4c14abd73b2e873e3bed532e4d07863d8c0"} Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.672152 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b3ffcfd2518bc6578364c9c8a0c4c14abd73b2e873e3bed532e4d07863d8c0" Feb 18 12:00:03 crc kubenswrapper[4922]: I0218 12:00:03.672153 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97" Feb 18 12:00:09 crc kubenswrapper[4922]: I0218 12:00:09.345848 4922 scope.go:117] "RemoveContainer" containerID="f9617d9b57da4e95bb7eb7f0412ba485b6082bcd962a46f87e7d295e47d23bfb" Feb 18 12:00:09 crc kubenswrapper[4922]: I0218 12:00:09.811018 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:00:09 crc kubenswrapper[4922]: I0218 12:00:09.811654 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:00:11 crc kubenswrapper[4922]: I0218 12:00:11.832228 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 18 12:00:21 crc kubenswrapper[4922]: I0218 12:00:21.201021 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 12:00:22 crc kubenswrapper[4922]: I0218 12:00:22.116047 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 12:00:25 crc kubenswrapper[4922]: I0218 12:00:25.686021 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="rabbitmq" containerID="cri-o://745e13628b88d2d6249c23c49d872e4f0c6a229af5e85aec709bffe789a60191" gracePeriod=604796 Feb 18 12:00:26 crc kubenswrapper[4922]: I0218 12:00:26.958474 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/rabbitmq-cell1-server-0" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="rabbitmq" containerID="cri-o://ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3" gracePeriod=604796 Feb 18 12:00:28 crc kubenswrapper[4922]: I0218 12:00:28.525453 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Feb 18 12:00:28 crc kubenswrapper[4922]: I0218 12:00:28.871344 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Feb 18 12:00:31 crc kubenswrapper[4922]: I0218 12:00:31.974193 4922 generic.go:334] "Generic (PLEG): container finished" podID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerID="745e13628b88d2d6249c23c49d872e4f0c6a229af5e85aec709bffe789a60191" exitCode=0 Feb 18 12:00:31 crc kubenswrapper[4922]: I0218 12:00:31.974825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cef557d2-b935-4cf6-98f1-d3c2251c0e38","Type":"ContainerDied","Data":"745e13628b88d2d6249c23c49d872e4f0c6a229af5e85aec709bffe789a60191"} Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.256653 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330041 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-server-conf\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330464 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gvvw\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-kube-api-access-9gvvw\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330581 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-plugins\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330681 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-plugins-conf\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330717 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-erlang-cookie\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330831 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-config-data\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330870 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cef557d2-b935-4cf6-98f1-d3c2251c0e38-erlang-cookie-secret\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.330952 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-confd\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.331000 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-tls\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.331037 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cef557d2-b935-4cf6-98f1-d3c2251c0e38-pod-info\") pod \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\" (UID: \"cef557d2-b935-4cf6-98f1-d3c2251c0e38\") " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 
12:00:32.332749 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.335852 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.336228 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.344014 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.347561 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef557d2-b935-4cf6-98f1-d3c2251c0e38-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.349716 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cef557d2-b935-4cf6-98f1-d3c2251c0e38-pod-info" (OuterVolumeSpecName: "pod-info") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.355865 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-kube-api-access-9gvvw" (OuterVolumeSpecName: "kube-api-access-9gvvw") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "kube-api-access-9gvvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.358425 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.392567 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-config-data" (OuterVolumeSpecName: "config-data") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433414 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-server-conf" (OuterVolumeSpecName: "server-conf") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433529 4922 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433567 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433578 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433591 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 
12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433600 4922 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cef557d2-b935-4cf6-98f1-d3c2251c0e38-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433609 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433617 4922 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cef557d2-b935-4cf6-98f1-d3c2251c0e38-pod-info\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433626 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gvvw\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-kube-api-access-9gvvw\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.433634 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.459483 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.506183 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cef557d2-b935-4cf6-98f1-d3c2251c0e38" (UID: "cef557d2-b935-4cf6-98f1-d3c2251c0e38"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.535809 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.535843 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cef557d2-b935-4cf6-98f1-d3c2251c0e38-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:32 crc kubenswrapper[4922]: I0218 12:00:32.535854 4922 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cef557d2-b935-4cf6-98f1-d3c2251c0e38-server-conf\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.002990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cef557d2-b935-4cf6-98f1-d3c2251c0e38","Type":"ContainerDied","Data":"d9788f1e654fa9ba3ad3f0a6ae9798137af27ad57e7e68121b8391b0725d166c"} Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.003043 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.003051 4922 scope.go:117] "RemoveContainer" containerID="745e13628b88d2d6249c23c49d872e4f0c6a229af5e85aec709bffe789a60191" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.035701 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.040505 4922 scope.go:117] "RemoveContainer" containerID="fcebb6698c6e4874e1a9ec8fbf1e0b1f0b32b5ba69a663e4d66216e0e480bd70" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.044748 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.065968 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 12:00:33 crc kubenswrapper[4922]: E0218 12:00:33.066669 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="rabbitmq" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.066689 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="rabbitmq" Feb 18 12:00:33 crc kubenswrapper[4922]: E0218 12:00:33.066713 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="setup-container" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.066720 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="setup-container" Feb 18 12:00:33 crc kubenswrapper[4922]: E0218 12:00:33.066740 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2dabc9-c094-41a8-8efd-7b113f5c634c" containerName="collect-profiles" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.066746 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee2dabc9-c094-41a8-8efd-7b113f5c634c" containerName="collect-profiles" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.066912 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" containerName="rabbitmq" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.066926 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2dabc9-c094-41a8-8efd-7b113f5c634c" containerName="collect-profiles" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.068416 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.070421 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.071476 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.071516 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ctw5n" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.071652 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.071681 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.071798 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.071889 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.094104 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 12:00:33 
crc kubenswrapper[4922]: I0218 12:00:33.151840 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-config-data\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.151888 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.151909 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.151928 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb934d91-0203-48d1-be6a-ab13e821993d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.151973 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.151993 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.152010 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb934d91-0203-48d1-be6a-ab13e821993d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.152033 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.152064 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.152087 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb55s\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-kube-api-access-jb55s\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.152144 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254129 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-config-data\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254171 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254191 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254216 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb934d91-0203-48d1-be6a-ab13e821993d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " 
pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254310 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254335 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb934d91-0203-48d1-be6a-ab13e821993d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254389 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254427 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.254455 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jb55s\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-kube-api-access-jb55s\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.255276 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.255476 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-config-data\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.255504 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.255668 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.256192 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.256271 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb934d91-0203-48d1-be6a-ab13e821993d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.260378 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.263556 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.270059 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb934d91-0203-48d1-be6a-ab13e821993d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.277323 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb934d91-0203-48d1-be6a-ab13e821993d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc 
kubenswrapper[4922]: I0218 12:00:33.277537 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb55s\" (UniqueName: \"kubernetes.io/projected/bb934d91-0203-48d1-be6a-ab13e821993d-kube-api-access-jb55s\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.325762 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bb934d91-0203-48d1-be6a-ab13e821993d\") " pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.440645 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.531886 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.564487 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9xcj\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-kube-api-access-d9xcj\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.564782 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12b84523-522e-4e8c-b78e-0094262fb1f8-erlang-cookie-secret\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.564860 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-tls\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.564898 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-erlang-cookie\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.564926 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12b84523-522e-4e8c-b78e-0094262fb1f8-pod-info\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.564963 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-plugins-conf\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.565000 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-confd\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.565016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-config-data\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 
12:00:33.565061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-server-conf\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.565102 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-plugins\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.565188 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"12b84523-522e-4e8c-b78e-0094262fb1f8\" (UID: \"12b84523-522e-4e8c-b78e-0094262fb1f8\") " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.567025 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.573087 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.574511 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b84523-522e-4e8c-b78e-0094262fb1f8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.577037 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-kube-api-access-d9xcj" (OuterVolumeSpecName: "kube-api-access-d9xcj") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "kube-api-access-d9xcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.578630 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/12b84523-522e-4e8c-b78e-0094262fb1f8-pod-info" (OuterVolumeSpecName: "pod-info") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.579100 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.579892 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.580744 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.628708 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-config-data" (OuterVolumeSpecName: "config-data") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721311 4922 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12b84523-522e-4e8c-b78e-0094262fb1f8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721351 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721403 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721438 4922 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12b84523-522e-4e8c-b78e-0094262fb1f8-pod-info\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721447 4922 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721456 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721467 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 
12:00:33.721496 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.721505 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9xcj\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-kube-api-access-d9xcj\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.753444 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-server-conf" (OuterVolumeSpecName: "server-conf") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.759380 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.788094 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "12b84523-522e-4e8c-b78e-0094262fb1f8" (UID: "12b84523-522e-4e8c-b78e-0094262fb1f8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.823655 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.823687 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12b84523-522e-4e8c-b78e-0094262fb1f8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.823698 4922 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12b84523-522e-4e8c-b78e-0094262fb1f8-server-conf\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:33 crc kubenswrapper[4922]: I0218 12:00:33.971934 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.018008 4922 generic.go:334] "Generic (PLEG): container finished" podID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerID="ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3" exitCode=0 Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.018122 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.018116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12b84523-522e-4e8c-b78e-0094262fb1f8","Type":"ContainerDied","Data":"ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3"} Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.019477 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12b84523-522e-4e8c-b78e-0094262fb1f8","Type":"ContainerDied","Data":"aab8e1d8bb4c1667bc6b73808bdb819ba395465155e9f67195316f9044955cf6"} Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.019533 4922 scope.go:117] "RemoveContainer" containerID="ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.027560 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb934d91-0203-48d1-be6a-ab13e821993d","Type":"ContainerStarted","Data":"feb4e60334b86b208e674d9d4c6478a43e10f01e123da2c32ceacc5e936672dc"} Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.106319 4922 scope.go:117] "RemoveContainer" containerID="e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.153453 4922 scope.go:117] "RemoveContainer" containerID="ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3" Feb 18 12:00:34 crc kubenswrapper[4922]: E0218 12:00:34.154328 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3\": container with ID starting with ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3 not found: ID does not exist" containerID="ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3" Feb 18 
12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.154387 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3"} err="failed to get container status \"ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3\": rpc error: code = NotFound desc = could not find container \"ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3\": container with ID starting with ebcec927014698c8dd3e3cce0832985c8aae1c112a06d773a543bb68530768f3 not found: ID does not exist" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.154417 4922 scope.go:117] "RemoveContainer" containerID="e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a" Feb 18 12:00:34 crc kubenswrapper[4922]: E0218 12:00:34.154810 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a\": container with ID starting with e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a not found: ID does not exist" containerID="e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.154829 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a"} err="failed to get container status \"e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a\": rpc error: code = NotFound desc = could not find container \"e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a\": container with ID starting with e2215857f67c60d9f5c91d2142e0be2f147282bd8c00e873605b8bd17b7df49a not found: ID does not exist" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.160447 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 12:00:34 
crc kubenswrapper[4922]: I0218 12:00:34.169760 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.195498 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 12:00:34 crc kubenswrapper[4922]: E0218 12:00:34.195904 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="rabbitmq" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.195920 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="rabbitmq" Feb 18 12:00:34 crc kubenswrapper[4922]: E0218 12:00:34.195954 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="setup-container" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.195960 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="setup-container" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.196149 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" containerName="rabbitmq" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.197162 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.198606 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.198669 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.199012 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.199144 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.199289 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.200074 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8fwmc" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.200582 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.205509 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.302203 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kjhlh"] Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.304495 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.322185 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjhlh"] Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.354825 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.354883 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355041 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9eb7dcb0-20c5-414c-bc86-58461654bcb5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355157 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355219 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355463 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9eb7dcb0-20c5-414c-bc86-58461654bcb5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355536 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355565 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szj9q\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-kube-api-access-szj9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355604 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.355895 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9eb7dcb0-20c5-414c-bc86-58461654bcb5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457471 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457514 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szj9q\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-kube-api-access-szj9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457546 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457575 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-utilities\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457673 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457734 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8p6v\" (UniqueName: \"kubernetes.io/projected/6666e009-8c33-402c-865e-03e35b98ad97-kube-api-access-s8p6v\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457797 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457821 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-catalog-content\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457874 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9eb7dcb0-20c5-414c-bc86-58461654bcb5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457911 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.457943 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.458353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.458714 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.458850 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.459007 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.459040 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.459438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9eb7dcb0-20c5-414c-bc86-58461654bcb5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.464004 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9eb7dcb0-20c5-414c-bc86-58461654bcb5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.464994 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.466256 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.468775 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9eb7dcb0-20c5-414c-bc86-58461654bcb5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc 
kubenswrapper[4922]: I0218 12:00:34.479495 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szj9q\" (UniqueName: \"kubernetes.io/projected/9eb7dcb0-20c5-414c-bc86-58461654bcb5-kube-api-access-szj9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.495698 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9eb7dcb0-20c5-414c-bc86-58461654bcb5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.559305 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-utilities\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.560117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8p6v\" (UniqueName: \"kubernetes.io/projected/6666e009-8c33-402c-865e-03e35b98ad97-kube-api-access-s8p6v\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.560015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-utilities\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.559873 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.560338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-catalog-content\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.560747 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-catalog-content\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.594873 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8p6v\" (UniqueName: \"kubernetes.io/projected/6666e009-8c33-402c-865e-03e35b98ad97-kube-api-access-s8p6v\") pod \"redhat-operators-kjhlh\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.619797 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.985286 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b84523-522e-4e8c-b78e-0094262fb1f8" path="/var/lib/kubelet/pods/12b84523-522e-4e8c-b78e-0094262fb1f8/volumes" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.986250 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef557d2-b935-4cf6-98f1-d3c2251c0e38" path="/var/lib/kubelet/pods/cef557d2-b935-4cf6-98f1-d3c2251c0e38/volumes" Feb 18 12:00:34 crc kubenswrapper[4922]: I0218 12:00:34.991658 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjhlh"] Feb 18 12:00:35 crc kubenswrapper[4922]: I0218 12:00:35.038154 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 12:00:35 crc kubenswrapper[4922]: I0218 12:00:35.934522 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pg7nw"] Feb 18 12:00:35 crc kubenswrapper[4922]: I0218 12:00:35.937503 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:35 crc kubenswrapper[4922]: I0218 12:00:35.943112 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 18 12:00:35 crc kubenswrapper[4922]: I0218 12:00:35.970452 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pg7nw"] Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019226 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-config\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019275 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z99xf\" (UniqueName: \"kubernetes.io/projected/ec061216-02ec-4395-a5a8-baa7004bf191-kube-api-access-z99xf\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-svc\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " 
pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019672 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019807 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.019923 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.065573 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9eb7dcb0-20c5-414c-bc86-58461654bcb5","Type":"ContainerStarted","Data":"d4ee5258db15b40985f75491134792db54b8f88a99d9dc67132d9916cec20645"} Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.067113 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb934d91-0203-48d1-be6a-ab13e821993d","Type":"ContainerStarted","Data":"601c44d2e7a1e66d83dce04779c8353d850c14d7d1ba8a2cf3bd8ac47fff773a"} Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.069881 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="6666e009-8c33-402c-865e-03e35b98ad97" containerID="a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4" exitCode=0 Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.069932 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjhlh" event={"ID":"6666e009-8c33-402c-865e-03e35b98ad97","Type":"ContainerDied","Data":"a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4"} Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.069971 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjhlh" event={"ID":"6666e009-8c33-402c-865e-03e35b98ad97","Type":"ContainerStarted","Data":"16b2380d157280b8df652af0256b934f5510d0494aaa70da277ce7c3af2d5728"} Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.124676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-svc\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.124809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.124845 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 
12:00:36.124918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.124970 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.125030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-config\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.125055 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z99xf\" (UniqueName: \"kubernetes.io/projected/ec061216-02ec-4395-a5a8-baa7004bf191-kube-api-access-z99xf\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.126273 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-svc\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.126808 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.127332 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.127931 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.128566 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-config\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.128696 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.151061 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z99xf\" (UniqueName: 
\"kubernetes.io/projected/ec061216-02ec-4395-a5a8-baa7004bf191-kube-api-access-z99xf\") pod \"dnsmasq-dns-d558885bc-pg7nw\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.274887 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:36 crc kubenswrapper[4922]: I0218 12:00:36.850164 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pg7nw"] Feb 18 12:00:37 crc kubenswrapper[4922]: I0218 12:00:37.081466 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" event={"ID":"ec061216-02ec-4395-a5a8-baa7004bf191","Type":"ContainerStarted","Data":"ce44beb1a2faa1a1bce3fbd62fd03f10442fe344f181a8b9740c07dc8a5954e6"} Feb 18 12:00:37 crc kubenswrapper[4922]: I0218 12:00:37.087275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9eb7dcb0-20c5-414c-bc86-58461654bcb5","Type":"ContainerStarted","Data":"0ea30ab9744b418aaa71c6de8970bfdb30e18f4cbcf5605e9ca3cf28ff78e461"} Feb 18 12:00:38 crc kubenswrapper[4922]: I0218 12:00:38.097919 4922 generic.go:334] "Generic (PLEG): container finished" podID="ec061216-02ec-4395-a5a8-baa7004bf191" containerID="a65dc7db9a28e801a77617198d8984945af873b42a9d92e64f4d248230c46bbb" exitCode=0 Feb 18 12:00:38 crc kubenswrapper[4922]: I0218 12:00:38.097998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" event={"ID":"ec061216-02ec-4395-a5a8-baa7004bf191","Type":"ContainerDied","Data":"a65dc7db9a28e801a77617198d8984945af873b42a9d92e64f4d248230c46bbb"} Feb 18 12:00:38 crc kubenswrapper[4922]: I0218 12:00:38.102060 4922 generic.go:334] "Generic (PLEG): container finished" podID="6666e009-8c33-402c-865e-03e35b98ad97" containerID="95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d" 
exitCode=0 Feb 18 12:00:38 crc kubenswrapper[4922]: I0218 12:00:38.103117 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjhlh" event={"ID":"6666e009-8c33-402c-865e-03e35b98ad97","Type":"ContainerDied","Data":"95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d"} Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.119037 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjhlh" event={"ID":"6666e009-8c33-402c-865e-03e35b98ad97","Type":"ContainerStarted","Data":"d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a"} Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.124704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" event={"ID":"ec061216-02ec-4395-a5a8-baa7004bf191","Type":"ContainerStarted","Data":"d2822d61699885da5388dcd0043c11b3bef2124daed27853cb52ed512f1cbcf2"} Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.124887 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.145604 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kjhlh" podStartSLOduration=2.736054368 podStartE2EDuration="5.145586487s" podCreationTimestamp="2026-02-18 12:00:34 +0000 UTC" firstStartedPulling="2026-02-18 12:00:36.071922317 +0000 UTC m=+1437.799626387" lastFinishedPulling="2026-02-18 12:00:38.481454426 +0000 UTC m=+1440.209158506" observedRunningTime="2026-02-18 12:00:39.13623631 +0000 UTC m=+1440.863940420" watchObservedRunningTime="2026-02-18 12:00:39.145586487 +0000 UTC m=+1440.873290567" Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.162068 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" podStartSLOduration=4.162048305 
podStartE2EDuration="4.162048305s" podCreationTimestamp="2026-02-18 12:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:00:39.16029629 +0000 UTC m=+1440.888000390" watchObservedRunningTime="2026-02-18 12:00:39.162048305 +0000 UTC m=+1440.889752385" Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.807116 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:00:39 crc kubenswrapper[4922]: I0218 12:00:39.807180 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:00:44 crc kubenswrapper[4922]: I0218 12:00:44.620426 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:44 crc kubenswrapper[4922]: I0218 12:00:44.620994 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:44 crc kubenswrapper[4922]: I0218 12:00:44.666882 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:45 crc kubenswrapper[4922]: I0218 12:00:45.230456 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:45 crc kubenswrapper[4922]: I0218 12:00:45.277402 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-kjhlh"] Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.276507 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.368991 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-c2lrw"] Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.369658 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerName="dnsmasq-dns" containerID="cri-o://e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4" gracePeriod=10 Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.458959 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-zdlvc"] Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.460563 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.482660 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-zdlvc"] Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.543774 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.543828 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.543878 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgmvm\" (UniqueName: \"kubernetes.io/projected/d7048bd5-50d1-472a-a898-6cf57cf126d8-kube-api-access-kgmvm\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.543944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-config\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.544046 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.544181 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.544243 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.648058 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.648128 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.648319 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.648379 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.648410 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgmvm\" (UniqueName: \"kubernetes.io/projected/d7048bd5-50d1-472a-a898-6cf57cf126d8-kube-api-access-kgmvm\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.648957 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-config\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.649138 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.650668 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-dns-svc\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.650701 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.651232 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.651489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.651851 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-config\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.652660 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7048bd5-50d1-472a-a898-6cf57cf126d8-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.681767 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgmvm\" (UniqueName: \"kubernetes.io/projected/d7048bd5-50d1-472a-a898-6cf57cf126d8-kube-api-access-kgmvm\") pod \"dnsmasq-dns-6b6dc74c5-zdlvc\" (UID: \"d7048bd5-50d1-472a-a898-6cf57cf126d8\") " pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.833247 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:46 crc kubenswrapper[4922]: I0218 12:00:46.960494 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.055732 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-swift-storage-0\") pod \"7ec5b650-c58d-4b8b-a903-7b95c211139c\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.056094 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-svc\") pod \"7ec5b650-c58d-4b8b-a903-7b95c211139c\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.056528 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-nb\") pod \"7ec5b650-c58d-4b8b-a903-7b95c211139c\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 
12:00:47.056575 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-config\") pod \"7ec5b650-c58d-4b8b-a903-7b95c211139c\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.056642 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-sb\") pod \"7ec5b650-c58d-4b8b-a903-7b95c211139c\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.056713 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chbg5\" (UniqueName: \"kubernetes.io/projected/7ec5b650-c58d-4b8b-a903-7b95c211139c-kube-api-access-chbg5\") pod \"7ec5b650-c58d-4b8b-a903-7b95c211139c\" (UID: \"7ec5b650-c58d-4b8b-a903-7b95c211139c\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.074039 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec5b650-c58d-4b8b-a903-7b95c211139c-kube-api-access-chbg5" (OuterVolumeSpecName: "kube-api-access-chbg5") pod "7ec5b650-c58d-4b8b-a903-7b95c211139c" (UID: "7ec5b650-c58d-4b8b-a903-7b95c211139c"). InnerVolumeSpecName "kube-api-access-chbg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.119903 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ec5b650-c58d-4b8b-a903-7b95c211139c" (UID: "7ec5b650-c58d-4b8b-a903-7b95c211139c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.133532 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ec5b650-c58d-4b8b-a903-7b95c211139c" (UID: "7ec5b650-c58d-4b8b-a903-7b95c211139c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.140003 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ec5b650-c58d-4b8b-a903-7b95c211139c" (UID: "7ec5b650-c58d-4b8b-a903-7b95c211139c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.161980 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.162226 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.162297 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.162414 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chbg5\" (UniqueName: 
\"kubernetes.io/projected/7ec5b650-c58d-4b8b-a903-7b95c211139c-kube-api-access-chbg5\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.162906 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-config" (OuterVolumeSpecName: "config") pod "7ec5b650-c58d-4b8b-a903-7b95c211139c" (UID: "7ec5b650-c58d-4b8b-a903-7b95c211139c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.194829 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ec5b650-c58d-4b8b-a903-7b95c211139c" (UID: "7ec5b650-c58d-4b8b-a903-7b95c211139c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.205436 4922 generic.go:334] "Generic (PLEG): container finished" podID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerID="e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4" exitCode=0 Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.205656 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" event={"ID":"7ec5b650-c58d-4b8b-a903-7b95c211139c","Type":"ContainerDied","Data":"e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4"} Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.205725 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" event={"ID":"7ec5b650-c58d-4b8b-a903-7b95c211139c","Type":"ContainerDied","Data":"ec7c038a1e2201c0d4227eabf40ab3692ebca8c65c8f35a81b082989a7f57de6"} Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.205751 4922 scope.go:117] "RemoveContainer" 
containerID="e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.205986 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-c2lrw" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.206227 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kjhlh" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="registry-server" containerID="cri-o://d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a" gracePeriod=2 Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.251975 4922 scope.go:117] "RemoveContainer" containerID="002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.268617 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.268662 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec5b650-c58d-4b8b-a903-7b95c211139c-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.313421 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-c2lrw"] Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.321574 4922 scope.go:117] "RemoveContainer" containerID="e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4" Feb 18 12:00:47 crc kubenswrapper[4922]: E0218 12:00:47.327554 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4\": container with ID starting with 
e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4 not found: ID does not exist" containerID="e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.327621 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4"} err="failed to get container status \"e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4\": rpc error: code = NotFound desc = could not find container \"e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4\": container with ID starting with e9ab7fb9a54d8f679a096c44f20292d1e9d8bf666191a5f0522387545b3f92c4 not found: ID does not exist" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.327657 4922 scope.go:117] "RemoveContainer" containerID="002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a" Feb 18 12:00:47 crc kubenswrapper[4922]: E0218 12:00:47.332285 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a\": container with ID starting with 002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a not found: ID does not exist" containerID="002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.332332 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a"} err="failed to get container status \"002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a\": rpc error: code = NotFound desc = could not find container \"002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a\": container with ID starting with 002e5f15ac919f451d0a14749c9df4d942fa8a8527a3d44eb7afde0e3a5e390a not found: ID does not 
exist" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.341554 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-c2lrw"] Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.369136 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dc74c5-zdlvc"] Feb 18 12:00:47 crc kubenswrapper[4922]: W0218 12:00:47.424696 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7048bd5_50d1_472a_a898_6cf57cf126d8.slice/crio-e533c7e82990e2ba6c93fbb180353bbfe4e66b977189b12e497649fc184f5b6a WatchSource:0}: Error finding container e533c7e82990e2ba6c93fbb180353bbfe4e66b977189b12e497649fc184f5b6a: Status 404 returned error can't find the container with id e533c7e82990e2ba6c93fbb180353bbfe4e66b977189b12e497649fc184f5b6a Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.859652 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.984049 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-utilities\") pod \"6666e009-8c33-402c-865e-03e35b98ad97\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.984392 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-catalog-content\") pod \"6666e009-8c33-402c-865e-03e35b98ad97\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.984464 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8p6v\" (UniqueName: 
\"kubernetes.io/projected/6666e009-8c33-402c-865e-03e35b98ad97-kube-api-access-s8p6v\") pod \"6666e009-8c33-402c-865e-03e35b98ad97\" (UID: \"6666e009-8c33-402c-865e-03e35b98ad97\") " Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.985071 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-utilities" (OuterVolumeSpecName: "utilities") pod "6666e009-8c33-402c-865e-03e35b98ad97" (UID: "6666e009-8c33-402c-865e-03e35b98ad97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:00:47 crc kubenswrapper[4922]: I0218 12:00:47.989324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6666e009-8c33-402c-865e-03e35b98ad97-kube-api-access-s8p6v" (OuterVolumeSpecName: "kube-api-access-s8p6v") pod "6666e009-8c33-402c-865e-03e35b98ad97" (UID: "6666e009-8c33-402c-865e-03e35b98ad97"). InnerVolumeSpecName "kube-api-access-s8p6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.086435 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.086471 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8p6v\" (UniqueName: \"kubernetes.io/projected/6666e009-8c33-402c-865e-03e35b98ad97-kube-api-access-s8p6v\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.096259 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6666e009-8c33-402c-865e-03e35b98ad97" (UID: "6666e009-8c33-402c-865e-03e35b98ad97"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.188616 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6666e009-8c33-402c-865e-03e35b98ad97-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.217391 4922 generic.go:334] "Generic (PLEG): container finished" podID="6666e009-8c33-402c-865e-03e35b98ad97" containerID="d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a" exitCode=0 Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.217438 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjhlh" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.217455 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjhlh" event={"ID":"6666e009-8c33-402c-865e-03e35b98ad97","Type":"ContainerDied","Data":"d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a"} Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.217479 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjhlh" event={"ID":"6666e009-8c33-402c-865e-03e35b98ad97","Type":"ContainerDied","Data":"16b2380d157280b8df652af0256b934f5510d0494aaa70da277ce7c3af2d5728"} Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.217508 4922 scope.go:117] "RemoveContainer" containerID="d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.219879 4922 generic.go:334] "Generic (PLEG): container finished" podID="d7048bd5-50d1-472a-a898-6cf57cf126d8" containerID="f760aef3dd84c8a9b4a1d6590daf508f28b329c6e8c215c202f916bdcf8ff7f3" exitCode=0 Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.219946 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" event={"ID":"d7048bd5-50d1-472a-a898-6cf57cf126d8","Type":"ContainerDied","Data":"f760aef3dd84c8a9b4a1d6590daf508f28b329c6e8c215c202f916bdcf8ff7f3"} Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.219975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" event={"ID":"d7048bd5-50d1-472a-a898-6cf57cf126d8","Type":"ContainerStarted","Data":"e533c7e82990e2ba6c93fbb180353bbfe4e66b977189b12e497649fc184f5b6a"} Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.269270 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjhlh"] Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.269953 4922 scope.go:117] "RemoveContainer" containerID="95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.279577 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kjhlh"] Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.293142 4922 scope.go:117] "RemoveContainer" containerID="a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.319336 4922 scope.go:117] "RemoveContainer" containerID="d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a" Feb 18 12:00:48 crc kubenswrapper[4922]: E0218 12:00:48.319721 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a\": container with ID starting with d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a not found: ID does not exist" containerID="d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.319746 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a"} err="failed to get container status \"d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a\": rpc error: code = NotFound desc = could not find container \"d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a\": container with ID starting with d89593d4fb50825f53b58c9f55d45d23ff0b9a1968fcffa7473579b979ff011a not found: ID does not exist" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.319767 4922 scope.go:117] "RemoveContainer" containerID="95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d" Feb 18 12:00:48 crc kubenswrapper[4922]: E0218 12:00:48.320005 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d\": container with ID starting with 95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d not found: ID does not exist" containerID="95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.320023 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d"} err="failed to get container status \"95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d\": rpc error: code = NotFound desc = could not find container \"95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d\": container with ID starting with 95962c83417849aecd0d2c14af67a541fb2f9eb477c7d2dac43a445ab6b17a5d not found: ID does not exist" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.320035 4922 scope.go:117] "RemoveContainer" containerID="a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4" Feb 18 12:00:48 crc kubenswrapper[4922]: E0218 12:00:48.320254 4922 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4\": container with ID starting with a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4 not found: ID does not exist" containerID="a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.320271 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4"} err="failed to get container status \"a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4\": rpc error: code = NotFound desc = could not find container \"a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4\": container with ID starting with a7ee766db58b3d5721673fe438cd3aa1ba8418159b881cea3cf559cebc804ed4 not found: ID does not exist" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.998116 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6666e009-8c33-402c-865e-03e35b98ad97" path="/var/lib/kubelet/pods/6666e009-8c33-402c-865e-03e35b98ad97/volumes" Feb 18 12:00:48 crc kubenswrapper[4922]: I0218 12:00:48.999335 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" path="/var/lib/kubelet/pods/7ec5b650-c58d-4b8b-a903-7b95c211139c/volumes" Feb 18 12:00:49 crc kubenswrapper[4922]: I0218 12:00:49.237716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" event={"ID":"d7048bd5-50d1-472a-a898-6cf57cf126d8","Type":"ContainerStarted","Data":"313947e563aa52919e647893fe1e854174e8a8d9215bc035726940710068797a"} Feb 18 12:00:49 crc kubenswrapper[4922]: I0218 12:00:49.238002 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:49 crc kubenswrapper[4922]: I0218 12:00:49.257595 4922 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" podStartSLOduration=3.257576363 podStartE2EDuration="3.257576363s" podCreationTimestamp="2026-02-18 12:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:00:49.254268489 +0000 UTC m=+1450.981972569" watchObservedRunningTime="2026-02-18 12:00:49.257576363 +0000 UTC m=+1450.985280443" Feb 18 12:00:56 crc kubenswrapper[4922]: I0218 12:00:56.835483 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6dc74c5-zdlvc" Feb 18 12:00:56 crc kubenswrapper[4922]: I0218 12:00:56.941160 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pg7nw"] Feb 18 12:00:56 crc kubenswrapper[4922]: I0218 12:00:56.948512 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" containerName="dnsmasq-dns" containerID="cri-o://d2822d61699885da5388dcd0043c11b3bef2124daed27853cb52ed512f1cbcf2" gracePeriod=10 Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.340973 4922 generic.go:334] "Generic (PLEG): container finished" podID="ec061216-02ec-4395-a5a8-baa7004bf191" containerID="d2822d61699885da5388dcd0043c11b3bef2124daed27853cb52ed512f1cbcf2" exitCode=0 Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.341044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" event={"ID":"ec061216-02ec-4395-a5a8-baa7004bf191","Type":"ContainerDied","Data":"d2822d61699885da5388dcd0043c11b3bef2124daed27853cb52ed512f1cbcf2"} Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.450634 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.636028 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-openstack-edpm-ipam\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.636107 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-sb\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.636164 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z99xf\" (UniqueName: \"kubernetes.io/projected/ec061216-02ec-4395-a5a8-baa7004bf191-kube-api-access-z99xf\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.636203 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-config\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.636230 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-swift-storage-0\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.637810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-svc\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.638137 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-nb\") pod \"ec061216-02ec-4395-a5a8-baa7004bf191\" (UID: \"ec061216-02ec-4395-a5a8-baa7004bf191\") " Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.642967 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec061216-02ec-4395-a5a8-baa7004bf191-kube-api-access-z99xf" (OuterVolumeSpecName: "kube-api-access-z99xf") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "kube-api-access-z99xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.699884 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.699892 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.710138 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.713354 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.716663 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.722001 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-config" (OuterVolumeSpecName: "config") pod "ec061216-02ec-4395-a5a8-baa7004bf191" (UID: "ec061216-02ec-4395-a5a8-baa7004bf191"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.740981 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.741015 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.741027 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.741037 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z99xf\" (UniqueName: \"kubernetes.io/projected/ec061216-02ec-4395-a5a8-baa7004bf191-kube-api-access-z99xf\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.741063 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.741074 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:57 crc kubenswrapper[4922]: I0218 12:00:57.741083 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec061216-02ec-4395-a5a8-baa7004bf191-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.352541 
4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" event={"ID":"ec061216-02ec-4395-a5a8-baa7004bf191","Type":"ContainerDied","Data":"ce44beb1a2faa1a1bce3fbd62fd03f10442fe344f181a8b9740c07dc8a5954e6"} Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.352605 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pg7nw" Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.352612 4922 scope.go:117] "RemoveContainer" containerID="d2822d61699885da5388dcd0043c11b3bef2124daed27853cb52ed512f1cbcf2" Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.382765 4922 scope.go:117] "RemoveContainer" containerID="a65dc7db9a28e801a77617198d8984945af873b42a9d92e64f4d248230c46bbb" Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.399122 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pg7nw"] Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.407592 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pg7nw"] Feb 18 12:00:58 crc kubenswrapper[4922]: I0218 12:00:58.992471 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" path="/var/lib/kubelet/pods/ec061216-02ec-4395-a5a8-baa7004bf191/volumes" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.150987 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29523601-t5w2s"] Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151706 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerName="dnsmasq-dns" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151724 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerName="dnsmasq-dns" Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151737 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="extract-content" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151744 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="extract-content" Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151771 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="registry-server" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151778 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="registry-server" Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151793 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerName="init" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151800 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerName="init" Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151815 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="extract-utilities" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151822 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="extract-utilities" Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151835 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" containerName="dnsmasq-dns" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151842 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" containerName="dnsmasq-dns" Feb 18 12:01:00 crc kubenswrapper[4922]: E0218 12:01:00.151857 4922 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" containerName="init" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.151868 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" containerName="init" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.152078 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec061216-02ec-4395-a5a8-baa7004bf191" containerName="dnsmasq-dns" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.152101 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec5b650-c58d-4b8b-a903-7b95c211139c" containerName="dnsmasq-dns" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.152116 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6666e009-8c33-402c-865e-03e35b98ad97" containerName="registry-server" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.152941 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.164259 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523601-t5w2s"] Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.188003 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-config-data\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.188047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwk2\" (UniqueName: \"kubernetes.io/projected/07d51aec-efff-44ea-b9c5-c5335f63e0f2-kube-api-access-glwk2\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " 
pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.188126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-combined-ca-bundle\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.188312 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-fernet-keys\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.289642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-combined-ca-bundle\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.289866 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-fernet-keys\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.289993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-config-data\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " 
pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.290081 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwk2\" (UniqueName: \"kubernetes.io/projected/07d51aec-efff-44ea-b9c5-c5335f63e0f2-kube-api-access-glwk2\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.299194 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-config-data\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.299446 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-fernet-keys\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.305185 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-combined-ca-bundle\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.318422 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwk2\" (UniqueName: \"kubernetes.io/projected/07d51aec-efff-44ea-b9c5-c5335f63e0f2-kube-api-access-glwk2\") pod \"keystone-cron-29523601-t5w2s\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc 
kubenswrapper[4922]: I0218 12:01:00.469027 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:00 crc kubenswrapper[4922]: I0218 12:01:00.994577 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523601-t5w2s"] Feb 18 12:01:01 crc kubenswrapper[4922]: I0218 12:01:01.386415 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523601-t5w2s" event={"ID":"07d51aec-efff-44ea-b9c5-c5335f63e0f2","Type":"ContainerStarted","Data":"d727a737a65ba6105cfd3ad5cbfbf25f5c64e5110bfdff32ea4e8f4470ef37ca"} Feb 18 12:01:01 crc kubenswrapper[4922]: I0218 12:01:01.386496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523601-t5w2s" event={"ID":"07d51aec-efff-44ea-b9c5-c5335f63e0f2","Type":"ContainerStarted","Data":"52162534e4622938a1f196f2b5aacb6d07a616c817761fe3e3576c455d12e223"} Feb 18 12:01:01 crc kubenswrapper[4922]: I0218 12:01:01.410295 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29523601-t5w2s" podStartSLOduration=1.410273576 podStartE2EDuration="1.410273576s" podCreationTimestamp="2026-02-18 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:01:01.405534375 +0000 UTC m=+1463.133238455" watchObservedRunningTime="2026-02-18 12:01:01.410273576 +0000 UTC m=+1463.137977656" Feb 18 12:01:03 crc kubenswrapper[4922]: I0218 12:01:03.405518 4922 generic.go:334] "Generic (PLEG): container finished" podID="07d51aec-efff-44ea-b9c5-c5335f63e0f2" containerID="d727a737a65ba6105cfd3ad5cbfbf25f5c64e5110bfdff32ea4e8f4470ef37ca" exitCode=0 Feb 18 12:01:03 crc kubenswrapper[4922]: I0218 12:01:03.405552 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523601-t5w2s" 
event={"ID":"07d51aec-efff-44ea-b9c5-c5335f63e0f2","Type":"ContainerDied","Data":"d727a737a65ba6105cfd3ad5cbfbf25f5c64e5110bfdff32ea4e8f4470ef37ca"} Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.837793 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.879208 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-config-data\") pod \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.879316 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glwk2\" (UniqueName: \"kubernetes.io/projected/07d51aec-efff-44ea-b9c5-c5335f63e0f2-kube-api-access-glwk2\") pod \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.879344 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-combined-ca-bundle\") pod \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.879378 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-fernet-keys\") pod \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\" (UID: \"07d51aec-efff-44ea-b9c5-c5335f63e0f2\") " Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.887225 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "07d51aec-efff-44ea-b9c5-c5335f63e0f2" (UID: "07d51aec-efff-44ea-b9c5-c5335f63e0f2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.910950 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d51aec-efff-44ea-b9c5-c5335f63e0f2-kube-api-access-glwk2" (OuterVolumeSpecName: "kube-api-access-glwk2") pod "07d51aec-efff-44ea-b9c5-c5335f63e0f2" (UID: "07d51aec-efff-44ea-b9c5-c5335f63e0f2"). InnerVolumeSpecName "kube-api-access-glwk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.918370 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07d51aec-efff-44ea-b9c5-c5335f63e0f2" (UID: "07d51aec-efff-44ea-b9c5-c5335f63e0f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:01:04 crc kubenswrapper[4922]: I0218 12:01:04.942689 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-config-data" (OuterVolumeSpecName: "config-data") pod "07d51aec-efff-44ea-b9c5-c5335f63e0f2" (UID: "07d51aec-efff-44ea-b9c5-c5335f63e0f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.011734 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.012072 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glwk2\" (UniqueName: \"kubernetes.io/projected/07d51aec-efff-44ea-b9c5-c5335f63e0f2-kube-api-access-glwk2\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.012100 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.012146 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07d51aec-efff-44ea-b9c5-c5335f63e0f2-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.428213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523601-t5w2s" event={"ID":"07d51aec-efff-44ea-b9c5-c5335f63e0f2","Type":"ContainerDied","Data":"52162534e4622938a1f196f2b5aacb6d07a616c817761fe3e3576c455d12e223"} Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.428295 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52162534e4622938a1f196f2b5aacb6d07a616c817761fe3e3576c455d12e223" Feb 18 12:01:05 crc kubenswrapper[4922]: I0218 12:01:05.428665 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523601-t5w2s" Feb 18 12:01:08 crc kubenswrapper[4922]: I0218 12:01:08.462504 4922 generic.go:334] "Generic (PLEG): container finished" podID="bb934d91-0203-48d1-be6a-ab13e821993d" containerID="601c44d2e7a1e66d83dce04779c8353d850c14d7d1ba8a2cf3bd8ac47fff773a" exitCode=0 Feb 18 12:01:08 crc kubenswrapper[4922]: I0218 12:01:08.462594 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb934d91-0203-48d1-be6a-ab13e821993d","Type":"ContainerDied","Data":"601c44d2e7a1e66d83dce04779c8353d850c14d7d1ba8a2cf3bd8ac47fff773a"} Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.281115 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx"] Feb 18 12:01:09 crc kubenswrapper[4922]: E0218 12:01:09.282992 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d51aec-efff-44ea-b9c5-c5335f63e0f2" containerName="keystone-cron" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.283021 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d51aec-efff-44ea-b9c5-c5335f63e0f2" containerName="keystone-cron" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.283250 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d51aec-efff-44ea-b9c5-c5335f63e0f2" containerName="keystone-cron" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.284162 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.286471 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.286683 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.286600 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.286990 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.296672 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx"] Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.399146 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.399448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc 
kubenswrapper[4922]: I0218 12:01:09.399577 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zc6j\" (UniqueName: \"kubernetes.io/projected/30aa9b56-28ab-4d32-beb5-965876a6e243-kube-api-access-7zc6j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.399693 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.473213 4922 generic.go:334] "Generic (PLEG): container finished" podID="9eb7dcb0-20c5-414c-bc86-58461654bcb5" containerID="0ea30ab9744b418aaa71c6de8970bfdb30e18f4cbcf5605e9ca3cf28ff78e461" exitCode=0 Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.473274 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9eb7dcb0-20c5-414c-bc86-58461654bcb5","Type":"ContainerDied","Data":"0ea30ab9744b418aaa71c6de8970bfdb30e18f4cbcf5605e9ca3cf28ff78e461"} Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.477901 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb934d91-0203-48d1-be6a-ab13e821993d","Type":"ContainerStarted","Data":"9f81234cb9ec53a5cd936915d2bb1d6d2143b00ffa178762474b013411e95937"} Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.478822 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.493578 4922 
scope.go:117] "RemoveContainer" containerID="5176cb9980de6bcd0a67b80f4ff01a72286ab10295b4c3d177fe01a39914f0b0" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.501466 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.501590 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.501652 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zc6j\" (UniqueName: \"kubernetes.io/projected/30aa9b56-28ab-4d32-beb5-965876a6e243-kube-api-access-7zc6j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.501690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.505805 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.509744 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.511438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.526079 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zc6j\" (UniqueName: \"kubernetes.io/projected/30aa9b56-28ab-4d32-beb5-965876a6e243-kube-api-access-7zc6j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.552657 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.552634236 podStartE2EDuration="36.552634236s" podCreationTimestamp="2026-02-18 12:00:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:01:09.537496713 +0000 UTC m=+1471.265200783" watchObservedRunningTime="2026-02-18 12:01:09.552634236 +0000 UTC m=+1471.280338316" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.601124 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.744694 4922 scope.go:117] "RemoveContainer" containerID="50e02399793b7c21fb8b885dfb76c0b8a822098669151d568f198c715c6c35d1" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.807074 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.807132 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.807181 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.808009 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef7998df4ff2ac956dafb01bf87962d308e4ed2ee7ceb57165a1d59bde7c799c"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:01:09 crc kubenswrapper[4922]: I0218 12:01:09.808087 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://ef7998df4ff2ac956dafb01bf87962d308e4ed2ee7ceb57165a1d59bde7c799c" gracePeriod=600 Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.267724 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx"] Feb 18 12:01:10 crc kubenswrapper[4922]: W0218 12:01:10.269891 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30aa9b56_28ab_4d32_beb5_965876a6e243.slice/crio-a5c218233f0436075dc0d643615f92c4d2ae717eff4d5e1d3a08f32bad9362e7 WatchSource:0}: Error finding container a5c218233f0436075dc0d643615f92c4d2ae717eff4d5e1d3a08f32bad9362e7: Status 404 returned error can't find the container with id a5c218233f0436075dc0d643615f92c4d2ae717eff4d5e1d3a08f32bad9362e7 Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.506138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" event={"ID":"30aa9b56-28ab-4d32-beb5-965876a6e243","Type":"ContainerStarted","Data":"a5c218233f0436075dc0d643615f92c4d2ae717eff4d5e1d3a08f32bad9362e7"} Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.510258 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="ef7998df4ff2ac956dafb01bf87962d308e4ed2ee7ceb57165a1d59bde7c799c" exitCode=0 Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.510443 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" 
event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"ef7998df4ff2ac956dafb01bf87962d308e4ed2ee7ceb57165a1d59bde7c799c"} Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.510728 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df"} Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.510803 4922 scope.go:117] "RemoveContainer" containerID="6ab0358a6b4b84604aa8265da97113127295fc06806ed22c39a69885110c93fc" Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.517879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9eb7dcb0-20c5-414c-bc86-58461654bcb5","Type":"ContainerStarted","Data":"9d05dc04734487fb2ac1de6c79ab5bd89a5821495af0603999521dbbacdc9b6a"} Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.518642 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:01:10 crc kubenswrapper[4922]: I0218 12:01:10.568379 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.568339523 podStartE2EDuration="36.568339523s" podCreationTimestamp="2026-02-18 12:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:01:10.56269873 +0000 UTC m=+1472.290402840" watchObservedRunningTime="2026-02-18 12:01:10.568339523 +0000 UTC m=+1472.296043613" Feb 18 12:01:21 crc kubenswrapper[4922]: I0218 12:01:21.655234 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" 
event={"ID":"30aa9b56-28ab-4d32-beb5-965876a6e243","Type":"ContainerStarted","Data":"e9389767f4da8a0c337e9f914b300c43c380f37d7c3b13c7680f21f3537ce2e0"} Feb 18 12:01:21 crc kubenswrapper[4922]: I0218 12:01:21.677663 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" podStartSLOduration=1.8241466480000001 podStartE2EDuration="12.677645987s" podCreationTimestamp="2026-02-18 12:01:09 +0000 UTC" firstStartedPulling="2026-02-18 12:01:10.272234554 +0000 UTC m=+1471.999938634" lastFinishedPulling="2026-02-18 12:01:21.125733893 +0000 UTC m=+1482.853437973" observedRunningTime="2026-02-18 12:01:21.671764998 +0000 UTC m=+1483.399469088" watchObservedRunningTime="2026-02-18 12:01:21.677645987 +0000 UTC m=+1483.405350067" Feb 18 12:01:23 crc kubenswrapper[4922]: I0218 12:01:23.536715 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 18 12:01:24 crc kubenswrapper[4922]: I0218 12:01:24.563186 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 18 12:01:31 crc kubenswrapper[4922]: I0218 12:01:31.744536 4922 generic.go:334] "Generic (PLEG): container finished" podID="30aa9b56-28ab-4d32-beb5-965876a6e243" containerID="e9389767f4da8a0c337e9f914b300c43c380f37d7c3b13c7680f21f3537ce2e0" exitCode=0 Feb 18 12:01:31 crc kubenswrapper[4922]: I0218 12:01:31.744626 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" event={"ID":"30aa9b56-28ab-4d32-beb5-965876a6e243","Type":"ContainerDied","Data":"e9389767f4da8a0c337e9f914b300c43c380f37d7c3b13c7680f21f3537ce2e0"} Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.175244 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.321865 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-inventory\") pod \"30aa9b56-28ab-4d32-beb5-965876a6e243\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.321945 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-repo-setup-combined-ca-bundle\") pod \"30aa9b56-28ab-4d32-beb5-965876a6e243\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.322254 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zc6j\" (UniqueName: \"kubernetes.io/projected/30aa9b56-28ab-4d32-beb5-965876a6e243-kube-api-access-7zc6j\") pod \"30aa9b56-28ab-4d32-beb5-965876a6e243\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.322300 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-ssh-key-openstack-edpm-ipam\") pod \"30aa9b56-28ab-4d32-beb5-965876a6e243\" (UID: \"30aa9b56-28ab-4d32-beb5-965876a6e243\") " Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.333114 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "30aa9b56-28ab-4d32-beb5-965876a6e243" (UID: "30aa9b56-28ab-4d32-beb5-965876a6e243"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.333122 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30aa9b56-28ab-4d32-beb5-965876a6e243-kube-api-access-7zc6j" (OuterVolumeSpecName: "kube-api-access-7zc6j") pod "30aa9b56-28ab-4d32-beb5-965876a6e243" (UID: "30aa9b56-28ab-4d32-beb5-965876a6e243"). InnerVolumeSpecName "kube-api-access-7zc6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.367963 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "30aa9b56-28ab-4d32-beb5-965876a6e243" (UID: "30aa9b56-28ab-4d32-beb5-965876a6e243"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.382302 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-inventory" (OuterVolumeSpecName: "inventory") pod "30aa9b56-28ab-4d32-beb5-965876a6e243" (UID: "30aa9b56-28ab-4d32-beb5-965876a6e243"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.424847 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zc6j\" (UniqueName: \"kubernetes.io/projected/30aa9b56-28ab-4d32-beb5-965876a6e243-kube-api-access-7zc6j\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.424912 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.424939 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.424963 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30aa9b56-28ab-4d32-beb5-965876a6e243-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.765952 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" event={"ID":"30aa9b56-28ab-4d32-beb5-965876a6e243","Type":"ContainerDied","Data":"a5c218233f0436075dc0d643615f92c4d2ae717eff4d5e1d3a08f32bad9362e7"} Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.766014 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c218233f0436075dc0d643615f92c4d2ae717eff4d5e1d3a08f32bad9362e7" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.766128 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.855770 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz"] Feb 18 12:01:33 crc kubenswrapper[4922]: E0218 12:01:33.856491 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30aa9b56-28ab-4d32-beb5-965876a6e243" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.856585 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30aa9b56-28ab-4d32-beb5-965876a6e243" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.856977 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30aa9b56-28ab-4d32-beb5-965876a6e243" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.858032 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.860404 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.860448 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.860699 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.861291 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:01:33 crc kubenswrapper[4922]: I0218 12:01:33.869066 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz"] Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.037054 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.037128 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q96v5\" (UniqueName: \"kubernetes.io/projected/08ba745d-df3b-42c0-a384-ca64c96dd47f-kube-api-access-q96v5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.037153 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.138549 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.138941 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q96v5\" (UniqueName: \"kubernetes.io/projected/08ba745d-df3b-42c0-a384-ca64c96dd47f-kube-api-access-q96v5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.138979 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.143206 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.147456 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.156303 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q96v5\" (UniqueName: \"kubernetes.io/projected/08ba745d-df3b-42c0-a384-ca64c96dd47f-kube-api-access-q96v5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zs9qz\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.185675 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.663525 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz"] Feb 18 12:01:34 crc kubenswrapper[4922]: I0218 12:01:34.775590 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" event={"ID":"08ba745d-df3b-42c0-a384-ca64c96dd47f","Type":"ContainerStarted","Data":"e05f61cfdec2576ef26cb87c3b27fc9b04217963669c18e44f62bb3cfcd46f2c"} Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.143290 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s6f4m"] Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.146146 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.153721 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6f4m"] Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.260260 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-catalog-content\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.260638 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhpc\" (UniqueName: \"kubernetes.io/projected/05c02623-ca55-4852-ac9f-1415e7d3abad-kube-api-access-5lhpc\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 
12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.260679 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-utilities\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.362115 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-catalog-content\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.362390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhpc\" (UniqueName: \"kubernetes.io/projected/05c02623-ca55-4852-ac9f-1415e7d3abad-kube-api-access-5lhpc\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.362477 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-utilities\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.362636 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-catalog-content\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m" Feb 18 12:01:35 crc 
kubenswrapper[4922]: I0218 12:01:35.363078 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-utilities\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.382483 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhpc\" (UniqueName: \"kubernetes.io/projected/05c02623-ca55-4852-ac9f-1415e7d3abad-kube-api-access-5lhpc\") pod \"certified-operators-s6f4m\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") " pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.489113 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.795295 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" event={"ID":"08ba745d-df3b-42c0-a384-ca64c96dd47f","Type":"ContainerStarted","Data":"7c1f4358428a0cbbbae7955082ba8a010faf4ce840919e0b17a376820b5b9299"}
Feb 18 12:01:35 crc kubenswrapper[4922]: I0218 12:01:35.825749 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" podStartSLOduration=2.291421275 podStartE2EDuration="2.825723274s" podCreationTimestamp="2026-02-18 12:01:33 +0000 UTC" firstStartedPulling="2026-02-18 12:01:34.671254164 +0000 UTC m=+1496.398958254" lastFinishedPulling="2026-02-18 12:01:35.205556173 +0000 UTC m=+1496.933260253" observedRunningTime="2026-02-18 12:01:35.812350666 +0000 UTC m=+1497.540054746" watchObservedRunningTime="2026-02-18 12:01:35.825723274 +0000 UTC m=+1497.553427354"
Feb 18 12:01:36 crc kubenswrapper[4922]: I0218 12:01:36.047645 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6f4m"]
Feb 18 12:01:36 crc kubenswrapper[4922]: I0218 12:01:36.808596 4922 generic.go:334] "Generic (PLEG): container finished" podID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerID="5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89" exitCode=0
Feb 18 12:01:36 crc kubenswrapper[4922]: I0218 12:01:36.808653 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerDied","Data":"5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89"}
Feb 18 12:01:36 crc kubenswrapper[4922]: I0218 12:01:36.808990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerStarted","Data":"db4c510ec11e469c18d71a0c6c6742633749b823ee5c8d01a1d647d346beade0"}
Feb 18 12:01:37 crc kubenswrapper[4922]: I0218 12:01:37.821552 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerStarted","Data":"b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec"}
Feb 18 12:01:38 crc kubenswrapper[4922]: I0218 12:01:38.833896 4922 generic.go:334] "Generic (PLEG): container finished" podID="08ba745d-df3b-42c0-a384-ca64c96dd47f" containerID="7c1f4358428a0cbbbae7955082ba8a010faf4ce840919e0b17a376820b5b9299" exitCode=0
Feb 18 12:01:38 crc kubenswrapper[4922]: I0218 12:01:38.833995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" event={"ID":"08ba745d-df3b-42c0-a384-ca64c96dd47f","Type":"ContainerDied","Data":"7c1f4358428a0cbbbae7955082ba8a010faf4ce840919e0b17a376820b5b9299"}
Feb 18 12:01:38 crc kubenswrapper[4922]: I0218 12:01:38.836547 4922 generic.go:334] "Generic (PLEG): container finished" podID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerID="b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec" exitCode=0
Feb 18 12:01:38 crc kubenswrapper[4922]: I0218 12:01:38.836579 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerDied","Data":"b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec"}
Feb 18 12:01:39 crc kubenswrapper[4922]: I0218 12:01:39.847262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerStarted","Data":"a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a"}
Feb 18 12:01:39 crc kubenswrapper[4922]: I0218 12:01:39.885497 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s6f4m" podStartSLOduration=2.440724638 podStartE2EDuration="4.88547186s" podCreationTimestamp="2026-02-18 12:01:35 +0000 UTC" firstStartedPulling="2026-02-18 12:01:36.811329633 +0000 UTC m=+1498.539033713" lastFinishedPulling="2026-02-18 12:01:39.256076855 +0000 UTC m=+1500.983780935" observedRunningTime="2026-02-18 12:01:39.874212935 +0000 UTC m=+1501.601917025" watchObservedRunningTime="2026-02-18 12:01:39.88547186 +0000 UTC m=+1501.613175940"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.270923 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.472936 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-ssh-key-openstack-edpm-ipam\") pod \"08ba745d-df3b-42c0-a384-ca64c96dd47f\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") "
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.473137 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-inventory\") pod \"08ba745d-df3b-42c0-a384-ca64c96dd47f\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") "
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.473309 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q96v5\" (UniqueName: \"kubernetes.io/projected/08ba745d-df3b-42c0-a384-ca64c96dd47f-kube-api-access-q96v5\") pod \"08ba745d-df3b-42c0-a384-ca64c96dd47f\" (UID: \"08ba745d-df3b-42c0-a384-ca64c96dd47f\") "
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.480453 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ba745d-df3b-42c0-a384-ca64c96dd47f-kube-api-access-q96v5" (OuterVolumeSpecName: "kube-api-access-q96v5") pod "08ba745d-df3b-42c0-a384-ca64c96dd47f" (UID: "08ba745d-df3b-42c0-a384-ca64c96dd47f"). InnerVolumeSpecName "kube-api-access-q96v5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.518469 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08ba745d-df3b-42c0-a384-ca64c96dd47f" (UID: "08ba745d-df3b-42c0-a384-ca64c96dd47f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.547503 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-inventory" (OuterVolumeSpecName: "inventory") pod "08ba745d-df3b-42c0-a384-ca64c96dd47f" (UID: "08ba745d-df3b-42c0-a384-ca64c96dd47f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.576104 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q96v5\" (UniqueName: \"kubernetes.io/projected/08ba745d-df3b-42c0-a384-ca64c96dd47f-kube-api-access-q96v5\") on node \"crc\" DevicePath \"\""
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.576153 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.576196 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08ba745d-df3b-42c0-a384-ca64c96dd47f-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.860148 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.861811 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zs9qz" event={"ID":"08ba745d-df3b-42c0-a384-ca64c96dd47f","Type":"ContainerDied","Data":"e05f61cfdec2576ef26cb87c3b27fc9b04217963669c18e44f62bb3cfcd46f2c"}
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.861867 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05f61cfdec2576ef26cb87c3b27fc9b04217963669c18e44f62bb3cfcd46f2c"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.932951 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"]
Feb 18 12:01:40 crc kubenswrapper[4922]: E0218 12:01:40.933438 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ba745d-df3b-42c0-a384-ca64c96dd47f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.933456 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ba745d-df3b-42c0-a384-ca64c96dd47f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.933683 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ba745d-df3b-42c0-a384-ca64c96dd47f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.934356 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.937317 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.937969 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.938182 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.940132 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.950082 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"]
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.985686 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.985736 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp5bp\" (UniqueName: \"kubernetes.io/projected/2685dd3b-59b6-4879-b59a-215b187b1344-kube-api-access-qp5bp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.985758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:40 crc kubenswrapper[4922]: I0218 12:01:40.985887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.087570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.087719 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.087743 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp5bp\" (UniqueName: \"kubernetes.io/projected/2685dd3b-59b6-4879-b59a-215b187b1344-kube-api-access-qp5bp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.087760 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.091128 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.091134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.091531 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.107490 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp5bp\" (UniqueName: \"kubernetes.io/projected/2685dd3b-59b6-4879-b59a-215b187b1344-kube-api-access-qp5bp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.251578 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"
Feb 18 12:01:41 crc kubenswrapper[4922]: W0218 12:01:41.774145 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2685dd3b_59b6_4879_b59a_215b187b1344.slice/crio-d8930c40b05ae338628746413aafafd1402c2f3d896ff0e8171acc0037b589fd WatchSource:0}: Error finding container d8930c40b05ae338628746413aafafd1402c2f3d896ff0e8171acc0037b589fd: Status 404 returned error can't find the container with id d8930c40b05ae338628746413aafafd1402c2f3d896ff0e8171acc0037b589fd
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.784016 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv"]
Feb 18 12:01:41 crc kubenswrapper[4922]: I0218 12:01:41.873117 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" event={"ID":"2685dd3b-59b6-4879-b59a-215b187b1344","Type":"ContainerStarted","Data":"d8930c40b05ae338628746413aafafd1402c2f3d896ff0e8171acc0037b589fd"}
Feb 18 12:01:42 crc kubenswrapper[4922]: I0218 12:01:42.883229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" event={"ID":"2685dd3b-59b6-4879-b59a-215b187b1344","Type":"ContainerStarted","Data":"753438b91bc2c108c2317277ef49245049570003d115f3ff5c156e26e54c9647"}
Feb 18 12:01:42 crc kubenswrapper[4922]: I0218 12:01:42.908874 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" podStartSLOduration=2.5283173420000002 podStartE2EDuration="2.908852161s" podCreationTimestamp="2026-02-18 12:01:40 +0000 UTC" firstStartedPulling="2026-02-18 12:01:41.777181847 +0000 UTC m=+1503.504885927" lastFinishedPulling="2026-02-18 12:01:42.157716666 +0000 UTC m=+1503.885420746" observedRunningTime="2026-02-18 12:01:42.898437127 +0000 UTC m=+1504.626141207" watchObservedRunningTime="2026-02-18 12:01:42.908852161 +0000 UTC m=+1504.636556241"
Feb 18 12:01:45 crc kubenswrapper[4922]: I0218 12:01:45.490049 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:45 crc kubenswrapper[4922]: I0218 12:01:45.490462 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:45 crc kubenswrapper[4922]: I0218 12:01:45.543144 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:45 crc kubenswrapper[4922]: I0218 12:01:45.953582 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:45 crc kubenswrapper[4922]: I0218 12:01:45.997547 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6f4m"]
Feb 18 12:01:47 crc kubenswrapper[4922]: I0218 12:01:47.932889 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s6f4m" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="registry-server" containerID="cri-o://a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a" gracePeriod=2
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.403012 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.533838 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-catalog-content\") pod \"05c02623-ca55-4852-ac9f-1415e7d3abad\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") "
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.533911 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lhpc\" (UniqueName: \"kubernetes.io/projected/05c02623-ca55-4852-ac9f-1415e7d3abad-kube-api-access-5lhpc\") pod \"05c02623-ca55-4852-ac9f-1415e7d3abad\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") "
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.534015 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-utilities\") pod \"05c02623-ca55-4852-ac9f-1415e7d3abad\" (UID: \"05c02623-ca55-4852-ac9f-1415e7d3abad\") "
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.535161 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-utilities" (OuterVolumeSpecName: "utilities") pod "05c02623-ca55-4852-ac9f-1415e7d3abad" (UID: "05c02623-ca55-4852-ac9f-1415e7d3abad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.539832 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c02623-ca55-4852-ac9f-1415e7d3abad-kube-api-access-5lhpc" (OuterVolumeSpecName: "kube-api-access-5lhpc") pod "05c02623-ca55-4852-ac9f-1415e7d3abad" (UID: "05c02623-ca55-4852-ac9f-1415e7d3abad"). InnerVolumeSpecName "kube-api-access-5lhpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.586277 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05c02623-ca55-4852-ac9f-1415e7d3abad" (UID: "05c02623-ca55-4852-ac9f-1415e7d3abad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.636713 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.636766 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c02623-ca55-4852-ac9f-1415e7d3abad-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.636787 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lhpc\" (UniqueName: \"kubernetes.io/projected/05c02623-ca55-4852-ac9f-1415e7d3abad-kube-api-access-5lhpc\") on node \"crc\" DevicePath \"\""
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.944609 4922 generic.go:334] "Generic (PLEG): container finished" podID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerID="a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a" exitCode=0
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.944679 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerDied","Data":"a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a"}
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.944681 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6f4m"
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.944737 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6f4m" event={"ID":"05c02623-ca55-4852-ac9f-1415e7d3abad","Type":"ContainerDied","Data":"db4c510ec11e469c18d71a0c6c6742633749b823ee5c8d01a1d647d346beade0"}
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.944776 4922 scope.go:117] "RemoveContainer" containerID="a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a"
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.969006 4922 scope.go:117] "RemoveContainer" containerID="b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec"
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.986712 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6f4m"]
Feb 18 12:01:48 crc kubenswrapper[4922]: I0218 12:01:48.996719 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s6f4m"]
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.001705 4922 scope.go:117] "RemoveContainer" containerID="5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89"
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.053175 4922 scope.go:117] "RemoveContainer" containerID="a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a"
Feb 18 12:01:49 crc kubenswrapper[4922]: E0218 12:01:49.053783 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a\": container with ID starting with a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a not found: ID does not exist" containerID="a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a"
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.053828 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a"} err="failed to get container status \"a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a\": rpc error: code = NotFound desc = could not find container \"a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a\": container with ID starting with a289f1679934ad5340d76b537a689e2bc8e2dc588042505fe99286b54adb293a not found: ID does not exist"
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.053856 4922 scope.go:117] "RemoveContainer" containerID="b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec"
Feb 18 12:01:49 crc kubenswrapper[4922]: E0218 12:01:49.054110 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec\": container with ID starting with b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec not found: ID does not exist" containerID="b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec"
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.054132 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec"} err="failed to get container status \"b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec\": rpc error: code = NotFound desc = could not find container \"b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec\": container with ID starting with b2420c2b091c5665f89700b0399e1d80eed8c610e604c4b313e169f55232a9ec not found: ID does not exist"
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.054148 4922 scope.go:117] "RemoveContainer" containerID="5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89"
Feb 18 12:01:49 crc kubenswrapper[4922]: E0218 12:01:49.054400 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89\": container with ID starting with 5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89 not found: ID does not exist" containerID="5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89"
Feb 18 12:01:49 crc kubenswrapper[4922]: I0218 12:01:49.054441 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89"} err="failed to get container status \"5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89\": rpc error: code = NotFound desc = could not find container \"5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89\": container with ID starting with 5a30dcc68908a23a96146c41c7a80874cf0e57089acae944a8f86d7b46b07b89 not found: ID does not exist"
Feb 18 12:01:50 crc kubenswrapper[4922]: I0218 12:01:50.996555 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" path="/var/lib/kubelet/pods/05c02623-ca55-4852-ac9f-1415e7d3abad/volumes"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.429686 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bcvc8"]
Feb 18 12:01:57 crc kubenswrapper[4922]: E0218 12:01:57.431895 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="extract-content"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.432005 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="extract-content"
Feb 18 12:01:57 crc kubenswrapper[4922]: E0218 12:01:57.432100 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="extract-utilities"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.432174 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="extract-utilities"
Feb 18 12:01:57 crc kubenswrapper[4922]: E0218 12:01:57.432242 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="registry-server"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.432295 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="registry-server"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.432625 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c02623-ca55-4852-ac9f-1415e7d3abad" containerName="registry-server"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.434292 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.446564 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcvc8"]
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.609256 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-catalog-content\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.609523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-utilities\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.609625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bs7\" (UniqueName: \"kubernetes.io/projected/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-kube-api-access-v5bs7\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.711211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-catalog-content\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.711260 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-utilities\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.711286 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bs7\" (UniqueName: \"kubernetes.io/projected/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-kube-api-access-v5bs7\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.711670 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-catalog-content\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.711733 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-utilities\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.729904 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bs7\" (UniqueName: \"kubernetes.io/projected/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-kube-api-access-v5bs7\") pod \"redhat-marketplace-bcvc8\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:57 crc kubenswrapper[4922]: I0218 12:01:57.761743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:01:58 crc kubenswrapper[4922]: I0218 12:01:58.268727 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcvc8"]
Feb 18 12:01:59 crc kubenswrapper[4922]: I0218 12:01:59.037963 4922 generic.go:334] "Generic (PLEG): container finished" podID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerID="5b0a2a7731dab0112315f444a33e408175b5c28abb2f69079912cabb44e2557e" exitCode=0
Feb 18 12:01:59 crc kubenswrapper[4922]: I0218 12:01:59.038057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcvc8" event={"ID":"208a6c21-2dd8-4f3d-8ca6-b767d89bb091","Type":"ContainerDied","Data":"5b0a2a7731dab0112315f444a33e408175b5c28abb2f69079912cabb44e2557e"}
Feb 18 12:01:59 crc kubenswrapper[4922]: I0218 12:01:59.038281 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcvc8" event={"ID":"208a6c21-2dd8-4f3d-8ca6-b767d89bb091","Type":"ContainerStarted","Data":"f9aa604d477338624553d6467acab8f953e16b7d54b6e61af455d59803710c81"}
Feb 18 12:02:01 crc kubenswrapper[4922]: I0218 12:02:01.079426 4922 generic.go:334] "Generic (PLEG): container finished" podID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerID="20488b03c617a43634c46564fbbe667e0a9ed54d02bba1ba2ca81cbbea0dad2c" exitCode=0
Feb 18 12:02:01 crc kubenswrapper[4922]: I0218 12:02:01.079508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcvc8" event={"ID":"208a6c21-2dd8-4f3d-8ca6-b767d89bb091","Type":"ContainerDied","Data":"20488b03c617a43634c46564fbbe667e0a9ed54d02bba1ba2ca81cbbea0dad2c"}
Feb 18 12:02:02 crc kubenswrapper[4922]: I0218 12:02:02.091303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcvc8" event={"ID":"208a6c21-2dd8-4f3d-8ca6-b767d89bb091","Type":"ContainerStarted","Data":"f4bd75adca76d727b7937c3ad606434e562eddda6632e0e9b7e47844174dbe7d"}
Feb 18 12:02:07 crc kubenswrapper[4922]: I0218 12:02:07.762257 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:02:07 crc kubenswrapper[4922]: I0218 12:02:07.762890 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:02:07 crc kubenswrapper[4922]: I0218 12:02:07.821528 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:02:07 crc kubenswrapper[4922]: I0218 12:02:07.842187 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bcvc8" podStartSLOduration=8.368031399 podStartE2EDuration="10.842168601s" podCreationTimestamp="2026-02-18 12:01:57 +0000 UTC" firstStartedPulling="2026-02-18 12:01:59.040784853 +0000 UTC m=+1520.768488933" lastFinishedPulling="2026-02-18 12:02:01.514922055 +0000 UTC m=+1523.242626135" observedRunningTime="2026-02-18 12:02:02.116275641 +0000 UTC m=+1523.843979711" watchObservedRunningTime="2026-02-18 12:02:07.842168601 +0000 UTC m=+1529.569872681"
Feb 18 12:02:08 crc kubenswrapper[4922]: I0218 12:02:08.203706 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bcvc8"
Feb 18 12:02:08 crc kubenswrapper[4922]: I0218 12:02:08.250126 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcvc8"]
Feb 18 12:02:10 crc kubenswrapper[4922]: I0218 12:02:10.022964 4922 scope.go:117] "RemoveContainer" containerID="ffcb369954fde5f2470b893096827224a7452c99e4a60328dd0f303414ba87d8"
Feb 18 12:02:10 crc kubenswrapper[4922]: I0218 12:02:10.181653 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bcvc8" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="registry-server" containerID="cri-o://f4bd75adca76d727b7937c3ad606434e562eddda6632e0e9b7e47844174dbe7d" gracePeriod=2
Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.197200 4922 generic.go:334] "Generic (PLEG): container finished" podID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerID="f4bd75adca76d727b7937c3ad606434e562eddda6632e0e9b7e47844174dbe7d" exitCode=0
Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.197309 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcvc8" event={"ID":"208a6c21-2dd8-4f3d-8ca6-b767d89bb091","Type":"ContainerDied","Data":"f4bd75adca76d727b7937c3ad606434e562eddda6632e0e9b7e47844174dbe7d"}
Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.197483 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcvc8" event={"ID":"208a6c21-2dd8-4f3d-8ca6-b767d89bb091","Type":"ContainerDied","Data":"f9aa604d477338624553d6467acab8f953e16b7d54b6e61af455d59803710c81"}
Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.197503 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9aa604d477338624553d6467acab8f953e16b7d54b6e61af455d59803710c81"
Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.202818 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcvc8" Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.295200 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-utilities\") pod \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.295336 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bs7\" (UniqueName: \"kubernetes.io/projected/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-kube-api-access-v5bs7\") pod \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.295635 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-catalog-content\") pod \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\" (UID: \"208a6c21-2dd8-4f3d-8ca6-b767d89bb091\") " Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.300294 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-utilities" (OuterVolumeSpecName: "utilities") pod "208a6c21-2dd8-4f3d-8ca6-b767d89bb091" (UID: "208a6c21-2dd8-4f3d-8ca6-b767d89bb091"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.307304 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-kube-api-access-v5bs7" (OuterVolumeSpecName: "kube-api-access-v5bs7") pod "208a6c21-2dd8-4f3d-8ca6-b767d89bb091" (UID: "208a6c21-2dd8-4f3d-8ca6-b767d89bb091"). InnerVolumeSpecName "kube-api-access-v5bs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.323416 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "208a6c21-2dd8-4f3d-8ca6-b767d89bb091" (UID: "208a6c21-2dd8-4f3d-8ca6-b767d89bb091"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.398732 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.398760 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:11 crc kubenswrapper[4922]: I0218 12:02:11.398770 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5bs7\" (UniqueName: \"kubernetes.io/projected/208a6c21-2dd8-4f3d-8ca6-b767d89bb091-kube-api-access-v5bs7\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:12 crc kubenswrapper[4922]: I0218 12:02:12.207218 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcvc8" Feb 18 12:02:12 crc kubenswrapper[4922]: I0218 12:02:12.248264 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcvc8"] Feb 18 12:02:12 crc kubenswrapper[4922]: I0218 12:02:12.263836 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcvc8"] Feb 18 12:02:12 crc kubenswrapper[4922]: I0218 12:02:12.986909 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" path="/var/lib/kubelet/pods/208a6c21-2dd8-4f3d-8ca6-b767d89bb091/volumes" Feb 18 12:03:10 crc kubenswrapper[4922]: I0218 12:03:10.143315 4922 scope.go:117] "RemoveContainer" containerID="1c2064f4a7cb5c8c928c435089f458b706b8fa933a0140eb0fd478bbb13155df" Feb 18 12:03:10 crc kubenswrapper[4922]: I0218 12:03:10.168303 4922 scope.go:117] "RemoveContainer" containerID="08eea4a1ca654ec0f46724358fe53403626c28895fe7bd0802a7388b3e60a117" Feb 18 12:03:39 crc kubenswrapper[4922]: I0218 12:03:39.807452 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:03:39 crc kubenswrapper[4922]: I0218 12:03:39.808034 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.052071 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a3b1-account-create-update-5qfd8"] Feb 18 12:04:03 crc kubenswrapper[4922]: 
I0218 12:04:03.067384 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mqx2n"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.079224 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-95b8-account-create-update-58r6z"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.090790 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2714-account-create-update-j5l9f"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.116076 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dd51-account-create-update-fr8ml"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.133197 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2zkj4"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.143790 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a3b1-account-create-update-5qfd8"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.155072 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mqx2n"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.164724 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-499z7"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.173863 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-xj7zt"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.183968 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dd51-account-create-update-fr8ml"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.193877 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2714-account-create-update-j5l9f"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.203547 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-xj7zt"] Feb 18 12:04:03 crc 
kubenswrapper[4922]: I0218 12:04:03.213115 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2zkj4"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.222469 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-499z7"] Feb 18 12:04:03 crc kubenswrapper[4922]: I0218 12:04:03.232589 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-95b8-account-create-update-58r6z"] Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.985774 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f03ea4-2462-4f2c-b9b8-395fc9802993" path="/var/lib/kubelet/pods/05f03ea4-2462-4f2c-b9b8-395fc9802993/volumes" Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.986973 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3c9160-dd6d-4591-9554-d3c74df3a64e" path="/var/lib/kubelet/pods/0d3c9160-dd6d-4591-9554-d3c74df3a64e/volumes" Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.987695 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b15fbe3-8f30-41e8-8897-037694ccb56b" path="/var/lib/kubelet/pods/3b15fbe3-8f30-41e8-8897-037694ccb56b/volumes" Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.988260 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e854dba-d50f-4228-9b7a-c8a0ae16347a" path="/var/lib/kubelet/pods/3e854dba-d50f-4228-9b7a-c8a0ae16347a/volumes" Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.989352 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811ffd65-f5dc-44a3-a1cb-778937ca9771" path="/var/lib/kubelet/pods/811ffd65-f5dc-44a3-a1cb-778937ca9771/volumes" Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.989876 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85eec6a5-292b-4061-bb90-18904535d9cc" path="/var/lib/kubelet/pods/85eec6a5-292b-4061-bb90-18904535d9cc/volumes" Feb 18 12:04:04 crc 
kubenswrapper[4922]: I0218 12:04:04.990425 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac3a541-a2f7-4d95-97ff-1361fbd3e81e" path="/var/lib/kubelet/pods/cac3a541-a2f7-4d95-97ff-1361fbd3e81e/volumes" Feb 18 12:04:04 crc kubenswrapper[4922]: I0218 12:04:04.991560 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc452273-8a5f-47d8-8aa5-1ddfe2240e28" path="/var/lib/kubelet/pods/cc452273-8a5f-47d8-8aa5-1ddfe2240e28/volumes" Feb 18 12:04:09 crc kubenswrapper[4922]: I0218 12:04:09.807622 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:04:09 crc kubenswrapper[4922]: I0218 12:04:09.808862 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.221244 4922 scope.go:117] "RemoveContainer" containerID="c34444b452577c8ce28445ed13810cd08c00c27fb0fdc17083d4a2fd0f78af8b" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.259159 4922 scope.go:117] "RemoveContainer" containerID="cca6d9a35f7a3b90b18a7c03d2317544897c0eada4e7ac4555360fd3747c2a53" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.322535 4922 scope.go:117] "RemoveContainer" containerID="d7cb0d481469976e2f0d27c45a95752ec61c6325990c025f51f551cbf42c3d51" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.392578 4922 scope.go:117] "RemoveContainer" containerID="e6310d509175d11956cf35f2881cac092138c8da93939b01d64f0006c350cdcc" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 
12:04:10.415878 4922 scope.go:117] "RemoveContainer" containerID="a445d9cdaa7af5fce54eb1734c1edd288e3e96655dfca8e174ddcae4e353e89d" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.489352 4922 scope.go:117] "RemoveContainer" containerID="ede12ae546b38989e02144d9c29b45bfe17c0490fbfcd99cc1a79c54dc349009" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.536534 4922 scope.go:117] "RemoveContainer" containerID="fedb74d1b6b4c2f0bd2aef2df515af61a8cc0caff9248cf99b996d4ac610fc62" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.584093 4922 scope.go:117] "RemoveContainer" containerID="e7a3b382cba61f7101e30bbe239f4c456f983c4335448f6fab20d5e184472620" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.639288 4922 scope.go:117] "RemoveContainer" containerID="6c511819319a5fc193d0f75ed63d9083c79d2c12407691912b456da48212acab" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.663697 4922 scope.go:117] "RemoveContainer" containerID="019a6342d78a17e9007569b8df388ae4cd83074ceff02bfe55eb5c3a71054609" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.694424 4922 scope.go:117] "RemoveContainer" containerID="d21236550250b2958c4054030789e8473894e0d1d7c3c12b25573ed942732eed" Feb 18 12:04:10 crc kubenswrapper[4922]: I0218 12:04:10.719924 4922 scope.go:117] "RemoveContainer" containerID="38f62ebe43eed17090600fd985ab87c725adb4a3b86d21051e6be95923794e24" Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.054125 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-czsfv"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.067996 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b89-account-create-update-x8q45"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.081919 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mvrlh"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.090311 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-create-mvrlh"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.099148 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-czsfv"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.107515 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b89-account-create-update-x8q45"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.116433 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f0db-account-create-update-vwv99"] Feb 18 12:04:35 crc kubenswrapper[4922]: I0218 12:04:35.126611 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f0db-account-create-update-vwv99"] Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.031006 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-g95qz"] Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.040797 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-59fc-account-create-update-29wvh"] Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.050827 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-59fc-account-create-update-29wvh"] Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.061949 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-g95qz"] Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.990240 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f413eca-d25a-4b47-82f6-e25088b65f2d" path="/var/lib/kubelet/pods/3f413eca-d25a-4b47-82f6-e25088b65f2d/volumes" Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.991844 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a88eb1-e58a-437e-b1eb-2dcb7e80b37f" path="/var/lib/kubelet/pods/46a88eb1-e58a-437e-b1eb-2dcb7e80b37f/volumes" Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.992699 4922 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb" path="/var/lib/kubelet/pods/76ba1bc1-3352-42a4-a80e-2fc2ac0e66eb/volumes" Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.993290 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87604619-ec13-480d-9456-c5062685287d" path="/var/lib/kubelet/pods/87604619-ec13-480d-9456-c5062685287d/volumes" Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.994892 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b521417c-1968-49ee-8435-9e44af7e8a52" path="/var/lib/kubelet/pods/b521417c-1968-49ee-8435-9e44af7e8a52/volumes" Feb 18 12:04:36 crc kubenswrapper[4922]: I0218 12:04:36.995802 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e55f2d-153c-47a0-95c4-84f8795ca57e" path="/var/lib/kubelet/pods/b9e55f2d-153c-47a0-95c4-84f8795ca57e/volumes" Feb 18 12:04:39 crc kubenswrapper[4922]: I0218 12:04:39.807948 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:04:39 crc kubenswrapper[4922]: I0218 12:04:39.808275 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:04:39 crc kubenswrapper[4922]: I0218 12:04:39.808341 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 12:04:39 crc kubenswrapper[4922]: I0218 12:04:39.809254 4922 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:04:39 crc kubenswrapper[4922]: I0218 12:04:39.809330 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" gracePeriod=600 Feb 18 12:04:39 crc kubenswrapper[4922]: E0218 12:04:39.945752 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:04:40 crc kubenswrapper[4922]: I0218 12:04:40.813375 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" exitCode=0 Feb 18 12:04:40 crc kubenswrapper[4922]: I0218 12:04:40.813423 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df"} Feb 18 12:04:40 crc kubenswrapper[4922]: I0218 12:04:40.813802 4922 scope.go:117] "RemoveContainer" containerID="ef7998df4ff2ac956dafb01bf87962d308e4ed2ee7ceb57165a1d59bde7c799c" Feb 18 12:04:40 crc 
kubenswrapper[4922]: I0218 12:04:40.814579 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:04:40 crc kubenswrapper[4922]: E0218 12:04:40.814910 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:04:51 crc kubenswrapper[4922]: I0218 12:04:51.973703 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:04:51 crc kubenswrapper[4922]: E0218 12:04:51.974650 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:04:52 crc kubenswrapper[4922]: I0218 12:04:52.960757 4922 generic.go:334] "Generic (PLEG): container finished" podID="2685dd3b-59b6-4879-b59a-215b187b1344" containerID="753438b91bc2c108c2317277ef49245049570003d115f3ff5c156e26e54c9647" exitCode=0 Feb 18 12:04:52 crc kubenswrapper[4922]: I0218 12:04:52.960830 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" event={"ID":"2685dd3b-59b6-4879-b59a-215b187b1344","Type":"ContainerDied","Data":"753438b91bc2c108c2317277ef49245049570003d115f3ff5c156e26e54c9647"} Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.409568 4922 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.552084 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp5bp\" (UniqueName: \"kubernetes.io/projected/2685dd3b-59b6-4879-b59a-215b187b1344-kube-api-access-qp5bp\") pod \"2685dd3b-59b6-4879-b59a-215b187b1344\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.552560 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-inventory\") pod \"2685dd3b-59b6-4879-b59a-215b187b1344\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.552734 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-ssh-key-openstack-edpm-ipam\") pod \"2685dd3b-59b6-4879-b59a-215b187b1344\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.552797 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-bootstrap-combined-ca-bundle\") pod \"2685dd3b-59b6-4879-b59a-215b187b1344\" (UID: \"2685dd3b-59b6-4879-b59a-215b187b1344\") " Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.563796 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2685dd3b-59b6-4879-b59a-215b187b1344-kube-api-access-qp5bp" (OuterVolumeSpecName: "kube-api-access-qp5bp") pod "2685dd3b-59b6-4879-b59a-215b187b1344" (UID: "2685dd3b-59b6-4879-b59a-215b187b1344"). 
InnerVolumeSpecName "kube-api-access-qp5bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.572609 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2685dd3b-59b6-4879-b59a-215b187b1344" (UID: "2685dd3b-59b6-4879-b59a-215b187b1344"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.589757 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-inventory" (OuterVolumeSpecName: "inventory") pod "2685dd3b-59b6-4879-b59a-215b187b1344" (UID: "2685dd3b-59b6-4879-b59a-215b187b1344"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.592049 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2685dd3b-59b6-4879-b59a-215b187b1344" (UID: "2685dd3b-59b6-4879-b59a-215b187b1344"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.657866 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.657943 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.657988 4922 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2685dd3b-59b6-4879-b59a-215b187b1344-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.658006 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp5bp\" (UniqueName: \"kubernetes.io/projected/2685dd3b-59b6-4879-b59a-215b187b1344-kube-api-access-qp5bp\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.986609 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.998240 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv" event={"ID":"2685dd3b-59b6-4879-b59a-215b187b1344","Type":"ContainerDied","Data":"d8930c40b05ae338628746413aafafd1402c2f3d896ff0e8171acc0037b589fd"} Feb 18 12:04:54 crc kubenswrapper[4922]: I0218 12:04:54.998338 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8930c40b05ae338628746413aafafd1402c2f3d896ff0e8171acc0037b589fd" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.105289 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m"] Feb 18 12:04:55 crc kubenswrapper[4922]: E0218 12:04:55.106072 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2685dd3b-59b6-4879-b59a-215b187b1344" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.106103 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2685dd3b-59b6-4879-b59a-215b187b1344" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 12:04:55 crc kubenswrapper[4922]: E0218 12:04:55.106118 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="registry-server" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.106125 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="registry-server" Feb 18 12:04:55 crc kubenswrapper[4922]: E0218 12:04:55.106179 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="extract-content" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.106191 4922 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="extract-content" Feb 18 12:04:55 crc kubenswrapper[4922]: E0218 12:04:55.106209 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="extract-utilities" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.106218 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="extract-utilities" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.106656 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="208a6c21-2dd8-4f3d-8ca6-b767d89bb091" containerName="registry-server" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.106683 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2685dd3b-59b6-4879-b59a-215b187b1344" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.107701 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.118528 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m"] Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.149860 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.150470 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.150490 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.150770 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.169950 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.170090 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 
12:04:55.170165 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8gh\" (UniqueName: \"kubernetes.io/projected/28a59f5e-155a-44b9-827a-a48bf1615d3d-kube-api-access-8h8gh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.272056 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.272210 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.272291 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8gh\" (UniqueName: \"kubernetes.io/projected/28a59f5e-155a-44b9-827a-a48bf1615d3d-kube-api-access-8h8gh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.283128 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.283323 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.292741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8gh\" (UniqueName: \"kubernetes.io/projected/28a59f5e-155a-44b9-827a-a48bf1615d3d-kube-api-access-8h8gh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7b67m\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:55 crc kubenswrapper[4922]: I0218 12:04:55.476879 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:04:56 crc kubenswrapper[4922]: I0218 12:04:56.044439 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m"] Feb 18 12:04:56 crc kubenswrapper[4922]: I0218 12:04:56.054568 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:04:57 crc kubenswrapper[4922]: I0218 12:04:57.011345 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" event={"ID":"28a59f5e-155a-44b9-827a-a48bf1615d3d","Type":"ContainerStarted","Data":"e4247a54fe570c27479d2bc1e0c4442c1e54068086222ac8518d6232db6583ea"} Feb 18 12:04:57 crc kubenswrapper[4922]: I0218 12:04:57.012102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" event={"ID":"28a59f5e-155a-44b9-827a-a48bf1615d3d","Type":"ContainerStarted","Data":"18ae4f5c65cb3d7731a9211b2c1da7f253992b418f856b813ce8c0520557a5a6"} Feb 18 12:04:57 crc kubenswrapper[4922]: I0218 12:04:57.039435 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" podStartSLOduration=1.515953669 podStartE2EDuration="2.039402143s" podCreationTimestamp="2026-02-18 12:04:55 +0000 UTC" firstStartedPulling="2026-02-18 12:04:56.054185995 +0000 UTC m=+1697.781890075" lastFinishedPulling="2026-02-18 12:04:56.577634469 +0000 UTC m=+1698.305338549" observedRunningTime="2026-02-18 12:04:57.03138805 +0000 UTC m=+1698.759092130" watchObservedRunningTime="2026-02-18 12:04:57.039402143 +0000 UTC m=+1698.767106223" Feb 18 12:04:57 crc kubenswrapper[4922]: I0218 12:04:57.064854 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xhpvj"] Feb 18 12:04:57 crc 
kubenswrapper[4922]: I0218 12:04:57.078714 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xhpvj"] Feb 18 12:04:58 crc kubenswrapper[4922]: I0218 12:04:58.989840 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056" path="/var/lib/kubelet/pods/3f2ad2ed-7e29-4760-aa9f-e5bf6bc96056/volumes" Feb 18 12:05:03 crc kubenswrapper[4922]: I0218 12:05:03.973461 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:05:03 crc kubenswrapper[4922]: E0218 12:05:03.974900 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:05:10 crc kubenswrapper[4922]: I0218 12:05:10.950396 4922 scope.go:117] "RemoveContainer" containerID="c059c2685304c1bbe7cb0f5bf34e25f7ad09441877372a8258fba3155ab8954a" Feb 18 12:05:11 crc kubenswrapper[4922]: I0218 12:05:11.000682 4922 scope.go:117] "RemoveContainer" containerID="3b378847d4d22bec8f5e44e34e16024756ec3850a6ac282a1c69fb4c7cbed59f" Feb 18 12:05:11 crc kubenswrapper[4922]: I0218 12:05:11.037699 4922 scope.go:117] "RemoveContainer" containerID="c14ea122843eb522430115cdefefb0c3b7d6b58a0e5c203b5fb2fca56b1e56ba" Feb 18 12:05:11 crc kubenswrapper[4922]: I0218 12:05:11.094085 4922 scope.go:117] "RemoveContainer" containerID="2773d6c651dc9198f969541910251805f8f5c022eea560495fc1c18971e69a25" Feb 18 12:05:11 crc kubenswrapper[4922]: I0218 12:05:11.117976 4922 scope.go:117] "RemoveContainer" containerID="0f464cbb217061ebc522def4ee4f957ba0ab7d4cce7acd97b0cad24b01e4e4cc" Feb 18 12:05:11 crc 
kubenswrapper[4922]: I0218 12:05:11.176756 4922 scope.go:117] "RemoveContainer" containerID="459c3dccc59cc9a7a25c5999d57a9a40f0164adf154409ca83fde7883515bc8e" Feb 18 12:05:11 crc kubenswrapper[4922]: I0218 12:05:11.229729 4922 scope.go:117] "RemoveContainer" containerID="f872729a376e1215c5740d39fd4f7a50b0af6397c21fc8ffd7ccc6655e4b95bf" Feb 18 12:05:11 crc kubenswrapper[4922]: I0218 12:05:11.290713 4922 scope.go:117] "RemoveContainer" containerID="cb46a05482d6c2364b368bb1c2e067b6de93db6a4072db86c206647939a79206" Feb 18 12:05:17 crc kubenswrapper[4922]: I0218 12:05:17.974085 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:05:17 crc kubenswrapper[4922]: E0218 12:05:17.975793 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:05:18 crc kubenswrapper[4922]: I0218 12:05:18.055465 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-fnkcj"] Feb 18 12:05:18 crc kubenswrapper[4922]: I0218 12:05:18.066712 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-fnkcj"] Feb 18 12:05:18 crc kubenswrapper[4922]: I0218 12:05:18.084904 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-sznrv"] Feb 18 12:05:18 crc kubenswrapper[4922]: I0218 12:05:18.100821 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-sznrv"] Feb 18 12:05:18 crc kubenswrapper[4922]: I0218 12:05:18.987552 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2102ef9b-8151-4edf-8b43-7c4486203911" 
path="/var/lib/kubelet/pods/2102ef9b-8151-4edf-8b43-7c4486203911/volumes" Feb 18 12:05:18 crc kubenswrapper[4922]: I0218 12:05:18.988110 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f835c05-4bbb-4678-9410-8523cf308f05" path="/var/lib/kubelet/pods/5f835c05-4bbb-4678-9410-8523cf308f05/volumes" Feb 18 12:05:31 crc kubenswrapper[4922]: I0218 12:05:31.973027 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:05:31 crc kubenswrapper[4922]: E0218 12:05:31.973811 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:05:43 crc kubenswrapper[4922]: I0218 12:05:43.974087 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:05:43 crc kubenswrapper[4922]: E0218 12:05:43.974875 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:05:54 crc kubenswrapper[4922]: I0218 12:05:54.973477 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:05:54 crc kubenswrapper[4922]: E0218 12:05:54.974223 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:06:08 crc kubenswrapper[4922]: I0218 12:06:08.980612 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:06:08 crc kubenswrapper[4922]: E0218 12:06:08.982738 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:06:11 crc kubenswrapper[4922]: I0218 12:06:11.499242 4922 scope.go:117] "RemoveContainer" containerID="96a538608703e5f575eecc4d7943dd23e9a2c36f206b74a346320d63b212ab98" Feb 18 12:06:11 crc kubenswrapper[4922]: I0218 12:06:11.528251 4922 scope.go:117] "RemoveContainer" containerID="7b6ad8f3e92b6bbfd943139b3553e32910b71922866701c9884de500a3eaebeb" Feb 18 12:06:15 crc kubenswrapper[4922]: I0218 12:06:15.050737 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8pvj2"] Feb 18 12:06:15 crc kubenswrapper[4922]: I0218 12:06:15.066594 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8pvj2"] Feb 18 12:06:17 crc kubenswrapper[4922]: I0218 12:06:17.005116 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0" path="/var/lib/kubelet/pods/53cda7fd-c342-49d3-a5bc-8fcb3e09b3e0/volumes" Feb 18 12:06:23 crc kubenswrapper[4922]: I0218 
12:06:23.973696 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:06:23 crc kubenswrapper[4922]: E0218 12:06:23.974554 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:06:37 crc kubenswrapper[4922]: I0218 12:06:37.973394 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:06:37 crc kubenswrapper[4922]: E0218 12:06:37.975543 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:06:45 crc kubenswrapper[4922]: I0218 12:06:45.042076 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-clz29"] Feb 18 12:06:45 crc kubenswrapper[4922]: I0218 12:06:45.054446 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-clz29"] Feb 18 12:06:46 crc kubenswrapper[4922]: I0218 12:06:46.994541 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71318c6d-61ee-4fb4-8682-7cf3fc0ae044" path="/var/lib/kubelet/pods/71318c6d-61ee-4fb4-8682-7cf3fc0ae044/volumes" Feb 18 12:06:47 crc kubenswrapper[4922]: I0218 12:06:47.030213 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-sync-mqfhx"] Feb 18 12:06:47 crc kubenswrapper[4922]: I0218 12:06:47.041130 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mqfhx"] Feb 18 12:06:48 crc kubenswrapper[4922]: I0218 12:06:48.986583 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31aad152-dcb7-472f-a0f8-d90ae972442b" path="/var/lib/kubelet/pods/31aad152-dcb7-472f-a0f8-d90ae972442b/volumes" Feb 18 12:06:50 crc kubenswrapper[4922]: I0218 12:06:50.973234 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:06:50 crc kubenswrapper[4922]: E0218 12:06:50.973991 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:06:53 crc kubenswrapper[4922]: I0218 12:06:53.040519 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rvpx7"] Feb 18 12:06:53 crc kubenswrapper[4922]: I0218 12:06:53.049408 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rvpx7"] Feb 18 12:06:54 crc kubenswrapper[4922]: I0218 12:06:54.987415 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7852f85-b8c5-458e-901c-3659c5ed2713" path="/var/lib/kubelet/pods/d7852f85-b8c5-458e-901c-3659c5ed2713/volumes" Feb 18 12:06:56 crc kubenswrapper[4922]: I0218 12:06:56.028578 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-st9pz"] Feb 18 12:06:56 crc kubenswrapper[4922]: I0218 12:06:56.039157 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-st9pz"] Feb 
18 12:06:56 crc kubenswrapper[4922]: I0218 12:06:56.988422 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855fb3ec-e473-4a99-a94f-cc96dda6d9c4" path="/var/lib/kubelet/pods/855fb3ec-e473-4a99-a94f-cc96dda6d9c4/volumes" Feb 18 12:07:00 crc kubenswrapper[4922]: I0218 12:07:00.276207 4922 generic.go:334] "Generic (PLEG): container finished" podID="28a59f5e-155a-44b9-827a-a48bf1615d3d" containerID="e4247a54fe570c27479d2bc1e0c4442c1e54068086222ac8518d6232db6583ea" exitCode=0 Feb 18 12:07:00 crc kubenswrapper[4922]: I0218 12:07:00.276302 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" event={"ID":"28a59f5e-155a-44b9-827a-a48bf1615d3d","Type":"ContainerDied","Data":"e4247a54fe570c27479d2bc1e0c4442c1e54068086222ac8518d6232db6583ea"} Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.684105 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.819704 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h8gh\" (UniqueName: \"kubernetes.io/projected/28a59f5e-155a-44b9-827a-a48bf1615d3d-kube-api-access-8h8gh\") pod \"28a59f5e-155a-44b9-827a-a48bf1615d3d\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.819832 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-inventory\") pod \"28a59f5e-155a-44b9-827a-a48bf1615d3d\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.819860 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-ssh-key-openstack-edpm-ipam\") pod \"28a59f5e-155a-44b9-827a-a48bf1615d3d\" (UID: \"28a59f5e-155a-44b9-827a-a48bf1615d3d\") " Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.825925 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a59f5e-155a-44b9-827a-a48bf1615d3d-kube-api-access-8h8gh" (OuterVolumeSpecName: "kube-api-access-8h8gh") pod "28a59f5e-155a-44b9-827a-a48bf1615d3d" (UID: "28a59f5e-155a-44b9-827a-a48bf1615d3d"). InnerVolumeSpecName "kube-api-access-8h8gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.847161 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-inventory" (OuterVolumeSpecName: "inventory") pod "28a59f5e-155a-44b9-827a-a48bf1615d3d" (UID: "28a59f5e-155a-44b9-827a-a48bf1615d3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.853351 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28a59f5e-155a-44b9-827a-a48bf1615d3d" (UID: "28a59f5e-155a-44b9-827a-a48bf1615d3d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.922941 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.923051 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a59f5e-155a-44b9-827a-a48bf1615d3d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:01 crc kubenswrapper[4922]: I0218 12:07:01.923119 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h8gh\" (UniqueName: \"kubernetes.io/projected/28a59f5e-155a-44b9-827a-a48bf1615d3d-kube-api-access-8h8gh\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.295446 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" event={"ID":"28a59f5e-155a-44b9-827a-a48bf1615d3d","Type":"ContainerDied","Data":"18ae4f5c65cb3d7731a9211b2c1da7f253992b418f856b813ce8c0520557a5a6"} Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.295492 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ae4f5c65cb3d7731a9211b2c1da7f253992b418f856b813ce8c0520557a5a6" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.295498 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7b67m" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.385459 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln"] Feb 18 12:07:02 crc kubenswrapper[4922]: E0218 12:07:02.385895 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a59f5e-155a-44b9-827a-a48bf1615d3d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.385914 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a59f5e-155a-44b9-827a-a48bf1615d3d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.386115 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a59f5e-155a-44b9-827a-a48bf1615d3d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.387049 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.396493 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln"] Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.548459 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr9w4\" (UniqueName: \"kubernetes.io/projected/b2f62f96-5ba4-4d16-89d8-11ae5e941699-kube-api-access-tr9w4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.548587 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.548901 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.549724 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.549969 4922 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.551019 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.551589 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.650855 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr9w4\" (UniqueName: \"kubernetes.io/projected/b2f62f96-5ba4-4d16-89d8-11ae5e941699-kube-api-access-tr9w4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.651219 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.651332 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.656217 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.662854 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.673548 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr9w4\" (UniqueName: \"kubernetes.io/projected/b2f62f96-5ba4-4d16-89d8-11ae5e941699-kube-api-access-tr9w4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:02 crc kubenswrapper[4922]: I0218 12:07:02.867232 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:07:03 crc kubenswrapper[4922]: I0218 12:07:03.420752 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln"] Feb 18 12:07:04 crc kubenswrapper[4922]: I0218 12:07:04.315531 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" event={"ID":"b2f62f96-5ba4-4d16-89d8-11ae5e941699","Type":"ContainerStarted","Data":"46c56531dc7f6fe8b2ba5d33360e8f5c404c054782958003bf246dc84e31961c"} Feb 18 12:07:04 crc kubenswrapper[4922]: I0218 12:07:04.316145 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" event={"ID":"b2f62f96-5ba4-4d16-89d8-11ae5e941699","Type":"ContainerStarted","Data":"13c41ae592beddfaa7d7f1100f4757a6653e55fee294002ac2782f785aade831"} Feb 18 12:07:05 crc kubenswrapper[4922]: I0218 12:07:05.973324 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:07:05 crc kubenswrapper[4922]: E0218 12:07:05.973898 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:07:11 crc kubenswrapper[4922]: I0218 12:07:11.650180 4922 scope.go:117] "RemoveContainer" containerID="fdbd37640dccec05f284f7ac0ade4661f3a31188c1203c2107b809735927d31b" Feb 18 12:07:11 crc kubenswrapper[4922]: I0218 12:07:11.686109 4922 scope.go:117] "RemoveContainer" 
containerID="a94ac8d35c2954541cdb5fa078a3d4a480cc9002ceb6fe5897faee87fe9e9f1f" Feb 18 12:07:11 crc kubenswrapper[4922]: I0218 12:07:11.728763 4922 scope.go:117] "RemoveContainer" containerID="f6b3d065bfd75b2cfcd638eb81caf2052d3e13f6ce152807bf20c4d48f24622e" Feb 18 12:07:11 crc kubenswrapper[4922]: I0218 12:07:11.776534 4922 scope.go:117] "RemoveContainer" containerID="5597fcea1ce088dbc75d5c2c33874c67f09dcb474426ad6372bb91fff5a29ef3" Feb 18 12:07:11 crc kubenswrapper[4922]: I0218 12:07:11.813388 4922 scope.go:117] "RemoveContainer" containerID="483ace8c8816a57983828e0b595d4b4e906ff56406e775bb5aac307ba89d5e30" Feb 18 12:07:19 crc kubenswrapper[4922]: I0218 12:07:19.044463 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" podStartSLOduration=16.482675762 podStartE2EDuration="17.044443223s" podCreationTimestamp="2026-02-18 12:07:02 +0000 UTC" firstStartedPulling="2026-02-18 12:07:03.432175086 +0000 UTC m=+1825.159879166" lastFinishedPulling="2026-02-18 12:07:03.993942557 +0000 UTC m=+1825.721646627" observedRunningTime="2026-02-18 12:07:04.334596427 +0000 UTC m=+1826.062300517" watchObservedRunningTime="2026-02-18 12:07:19.044443223 +0000 UTC m=+1840.772147303" Feb 18 12:07:19 crc kubenswrapper[4922]: I0218 12:07:19.050696 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zjb6x"] Feb 18 12:07:19 crc kubenswrapper[4922]: I0218 12:07:19.059917 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zjb6x"] Feb 18 12:07:20 crc kubenswrapper[4922]: I0218 12:07:20.973276 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:07:20 crc kubenswrapper[4922]: E0218 12:07:20.973639 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:07:20 crc kubenswrapper[4922]: I0218 12:07:20.991247 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bcd3608-244b-44f0-be1f-5d953cd35964" path="/var/lib/kubelet/pods/4bcd3608-244b-44f0-be1f-5d953cd35964/volumes" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.103209 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vklb7"] Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.118769 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.138507 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vklb7"] Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.233522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm7gl\" (UniqueName: \"kubernetes.io/projected/ac7439c4-4267-4309-aae7-259734126f27-kube-api-access-lm7gl\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.233760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-utilities\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.233955 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-catalog-content\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.335746 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm7gl\" (UniqueName: \"kubernetes.io/projected/ac7439c4-4267-4309-aae7-259734126f27-kube-api-access-lm7gl\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.335845 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-utilities\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.335906 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-catalog-content\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.336466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-catalog-content\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.336946 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-utilities\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.356151 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm7gl\" (UniqueName: \"kubernetes.io/projected/ac7439c4-4267-4309-aae7-259734126f27-kube-api-access-lm7gl\") pod \"community-operators-vklb7\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.441067 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:22 crc kubenswrapper[4922]: I0218 12:07:22.903857 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vklb7"] Feb 18 12:07:23 crc kubenswrapper[4922]: I0218 12:07:23.482399 4922 generic.go:334] "Generic (PLEG): container finished" podID="ac7439c4-4267-4309-aae7-259734126f27" containerID="95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0" exitCode=0 Feb 18 12:07:23 crc kubenswrapper[4922]: I0218 12:07:23.482453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerDied","Data":"95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0"} Feb 18 12:07:23 crc kubenswrapper[4922]: I0218 12:07:23.482488 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerStarted","Data":"e53febc7959bdd6c792cd27a5466ba3ccfc2726e0fe755d18fad2114a8a250b6"} Feb 18 12:07:24 crc kubenswrapper[4922]: I0218 
12:07:24.494328 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerStarted","Data":"1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec"} Feb 18 12:07:25 crc kubenswrapper[4922]: I0218 12:07:25.506122 4922 generic.go:334] "Generic (PLEG): container finished" podID="ac7439c4-4267-4309-aae7-259734126f27" containerID="1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec" exitCode=0 Feb 18 12:07:25 crc kubenswrapper[4922]: I0218 12:07:25.506487 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerDied","Data":"1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec"} Feb 18 12:07:26 crc kubenswrapper[4922]: I0218 12:07:26.517699 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerStarted","Data":"f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf"} Feb 18 12:07:26 crc kubenswrapper[4922]: I0218 12:07:26.541948 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vklb7" podStartSLOduration=2.093982673 podStartE2EDuration="4.54192547s" podCreationTimestamp="2026-02-18 12:07:22 +0000 UTC" firstStartedPulling="2026-02-18 12:07:23.484081927 +0000 UTC m=+1845.211786007" lastFinishedPulling="2026-02-18 12:07:25.932024724 +0000 UTC m=+1847.659728804" observedRunningTime="2026-02-18 12:07:26.534160454 +0000 UTC m=+1848.261864564" watchObservedRunningTime="2026-02-18 12:07:26.54192547 +0000 UTC m=+1848.269629550" Feb 18 12:07:32 crc kubenswrapper[4922]: I0218 12:07:32.442240 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vklb7" 
Feb 18 12:07:32 crc kubenswrapper[4922]: I0218 12:07:32.442960 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:32 crc kubenswrapper[4922]: I0218 12:07:32.490177 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:32 crc kubenswrapper[4922]: I0218 12:07:32.614500 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:32 crc kubenswrapper[4922]: I0218 12:07:32.724038 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vklb7"] Feb 18 12:07:33 crc kubenswrapper[4922]: I0218 12:07:33.973191 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:07:33 crc kubenswrapper[4922]: E0218 12:07:33.973552 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:07:34 crc kubenswrapper[4922]: I0218 12:07:34.584836 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vklb7" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="registry-server" containerID="cri-o://f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf" gracePeriod=2 Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.077191 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.103128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-catalog-content\") pod \"ac7439c4-4267-4309-aae7-259734126f27\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.103217 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-utilities\") pod \"ac7439c4-4267-4309-aae7-259734126f27\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.103250 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm7gl\" (UniqueName: \"kubernetes.io/projected/ac7439c4-4267-4309-aae7-259734126f27-kube-api-access-lm7gl\") pod \"ac7439c4-4267-4309-aae7-259734126f27\" (UID: \"ac7439c4-4267-4309-aae7-259734126f27\") " Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.108447 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-utilities" (OuterVolumeSpecName: "utilities") pod "ac7439c4-4267-4309-aae7-259734126f27" (UID: "ac7439c4-4267-4309-aae7-259734126f27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.121231 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac7439c4-4267-4309-aae7-259734126f27-kube-api-access-lm7gl" (OuterVolumeSpecName: "kube-api-access-lm7gl") pod "ac7439c4-4267-4309-aae7-259734126f27" (UID: "ac7439c4-4267-4309-aae7-259734126f27"). InnerVolumeSpecName "kube-api-access-lm7gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.161690 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac7439c4-4267-4309-aae7-259734126f27" (UID: "ac7439c4-4267-4309-aae7-259734126f27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.205888 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.205943 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac7439c4-4267-4309-aae7-259734126f27-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.205953 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm7gl\" (UniqueName: \"kubernetes.io/projected/ac7439c4-4267-4309-aae7-259734126f27-kube-api-access-lm7gl\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.597000 4922 generic.go:334] "Generic (PLEG): container finished" podID="ac7439c4-4267-4309-aae7-259734126f27" containerID="f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf" exitCode=0 Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.597060 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerDied","Data":"f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf"} Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.597091 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-vklb7" event={"ID":"ac7439c4-4267-4309-aae7-259734126f27","Type":"ContainerDied","Data":"e53febc7959bdd6c792cd27a5466ba3ccfc2726e0fe755d18fad2114a8a250b6"} Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.597107 4922 scope.go:117] "RemoveContainer" containerID="f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.597112 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vklb7" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.621613 4922 scope.go:117] "RemoveContainer" containerID="1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.655396 4922 scope.go:117] "RemoveContainer" containerID="95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.662886 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vklb7"] Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.701823 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vklb7"] Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.733536 4922 scope.go:117] "RemoveContainer" containerID="f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf" Feb 18 12:07:35 crc kubenswrapper[4922]: E0218 12:07:35.757920 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf\": container with ID starting with f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf not found: ID does not exist" containerID="f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 
12:07:35.757983 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf"} err="failed to get container status \"f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf\": rpc error: code = NotFound desc = could not find container \"f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf\": container with ID starting with f9838d0ef428602969b121ca90b6ec2759edb7614230d6ca342fd808e9027bdf not found: ID does not exist" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.758017 4922 scope.go:117] "RemoveContainer" containerID="1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec" Feb 18 12:07:35 crc kubenswrapper[4922]: E0218 12:07:35.785152 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec\": container with ID starting with 1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec not found: ID does not exist" containerID="1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.785241 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec"} err="failed to get container status \"1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec\": rpc error: code = NotFound desc = could not find container \"1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec\": container with ID starting with 1fcb5b811fa263973c9bb501d12265509e1f4dcccaa9ed6fa964c6181ac3e2ec not found: ID does not exist" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.785303 4922 scope.go:117] "RemoveContainer" containerID="95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0" Feb 18 12:07:35 crc 
kubenswrapper[4922]: E0218 12:07:35.788776 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0\": container with ID starting with 95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0 not found: ID does not exist" containerID="95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0" Feb 18 12:07:35 crc kubenswrapper[4922]: I0218 12:07:35.788857 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0"} err="failed to get container status \"95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0\": rpc error: code = NotFound desc = could not find container \"95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0\": container with ID starting with 95f824af57fb1a326e2795f6ef387d9d820419c7f79841467ba73d6d09a47ab0 not found: ID does not exist" Feb 18 12:07:36 crc kubenswrapper[4922]: I0218 12:07:36.037041 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-p2pzf"] Feb 18 12:07:36 crc kubenswrapper[4922]: I0218 12:07:36.044954 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-p2pzf"] Feb 18 12:07:36 crc kubenswrapper[4922]: I0218 12:07:36.053017 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c19a-account-create-update-24shd"] Feb 18 12:07:36 crc kubenswrapper[4922]: I0218 12:07:36.061860 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c19a-account-create-update-24shd"] Feb 18 12:07:36 crc kubenswrapper[4922]: I0218 12:07:36.983763 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157bc07b-77b8-4a29-b8e0-9a205215187b" path="/var/lib/kubelet/pods/157bc07b-77b8-4a29-b8e0-9a205215187b/volumes" Feb 18 12:07:36 crc 
kubenswrapper[4922]: I0218 12:07:36.984457 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac7439c4-4267-4309-aae7-259734126f27" path="/var/lib/kubelet/pods/ac7439c4-4267-4309-aae7-259734126f27/volumes" Feb 18 12:07:36 crc kubenswrapper[4922]: I0218 12:07:36.985208 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9360e33-9ae9-4b84-a898-c2c22626a565" path="/var/lib/kubelet/pods/f9360e33-9ae9-4b84-a898-c2c22626a565/volumes" Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.032692 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-96dc-account-create-update-4px58"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.046449 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-b9cbq"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.055515 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ea20-account-create-update-k5t5k"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.062960 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-96dc-account-create-update-4px58"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.071989 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-b9cbq"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.080136 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ea20-account-create-update-k5t5k"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.090629 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ddrmz"] Feb 18 12:07:37 crc kubenswrapper[4922]: I0218 12:07:37.114179 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ddrmz"] Feb 18 12:07:38 crc kubenswrapper[4922]: I0218 12:07:38.987642 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="41c3abe9-3a81-44ef-babf-818b176f6437" path="/var/lib/kubelet/pods/41c3abe9-3a81-44ef-babf-818b176f6437/volumes" Feb 18 12:07:38 crc kubenswrapper[4922]: I0218 12:07:38.988618 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7513cf0a-f653-48b9-a365-9732179aaffc" path="/var/lib/kubelet/pods/7513cf0a-f653-48b9-a365-9732179aaffc/volumes" Feb 18 12:07:38 crc kubenswrapper[4922]: I0218 12:07:38.989128 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b28b3ba-c697-4cef-8e3f-e41317e3abe6" path="/var/lib/kubelet/pods/9b28b3ba-c697-4cef-8e3f-e41317e3abe6/volumes" Feb 18 12:07:38 crc kubenswrapper[4922]: I0218 12:07:38.989668 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea3a613-3571-4de4-be73-07a4db1c146e" path="/var/lib/kubelet/pods/cea3a613-3571-4de4-be73-07a4db1c146e/volumes" Feb 18 12:07:45 crc kubenswrapper[4922]: I0218 12:07:45.974344 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:07:45 crc kubenswrapper[4922]: E0218 12:07:45.975032 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:07:58 crc kubenswrapper[4922]: I0218 12:07:58.979518 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:07:58 crc kubenswrapper[4922]: E0218 12:07:58.980200 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:08:07 crc kubenswrapper[4922]: I0218 12:08:07.049722 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws7m8"] Feb 18 12:08:07 crc kubenswrapper[4922]: I0218 12:08:07.060249 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws7m8"] Feb 18 12:08:08 crc kubenswrapper[4922]: I0218 12:08:08.985085 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5" path="/var/lib/kubelet/pods/ca4c58a3-4dbc-4aae-90cb-0b77d123e5d5/volumes" Feb 18 12:08:10 crc kubenswrapper[4922]: I0218 12:08:10.973145 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:08:10 crc kubenswrapper[4922]: E0218 12:08:10.973917 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:08:11 crc kubenswrapper[4922]: I0218 12:08:11.973492 4922 scope.go:117] "RemoveContainer" containerID="9eae3101b2310737957f7e6d08c731592c72422d2cd0b2731a1d4e5979cf4d34" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.016067 4922 scope.go:117] "RemoveContainer" containerID="ac0ab0e9aaca817513e97dbed88ce8e6eac29d917cc8fc47fb5c8da1460429d9" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.070653 4922 scope.go:117] "RemoveContainer" 
containerID="64eecc2f7b37277d3eaa046c9a0ed3c760eb621a97a64a32162b927b6a47e4d1" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.105262 4922 scope.go:117] "RemoveContainer" containerID="38158e702c80c7a31868549296123efd8bf7bfe465aae6ab472768cf855c3a38" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.172340 4922 scope.go:117] "RemoveContainer" containerID="bf9c600a2ed87d6610ddd93585d70fd608560cee157862758bf35ddf2c2a4754" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.197494 4922 scope.go:117] "RemoveContainer" containerID="b2705f7646a829371102cfea131123135c7490beec56768c960d999d9c5fa2de" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.255054 4922 scope.go:117] "RemoveContainer" containerID="f4bd75adca76d727b7937c3ad606434e562eddda6632e0e9b7e47844174dbe7d" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.273383 4922 scope.go:117] "RemoveContainer" containerID="2ca4584a9bdc48778761e031dee58443f19954bb292722b1e84049c5d0d3891e" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.292213 4922 scope.go:117] "RemoveContainer" containerID="5b0a2a7731dab0112315f444a33e408175b5c28abb2f69079912cabb44e2557e" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.313415 4922 scope.go:117] "RemoveContainer" containerID="6447a128ed7c2ef808d7f125af51fae05848cf0c3a78ee0cef1bb21c2071c85c" Feb 18 12:08:12 crc kubenswrapper[4922]: I0218 12:08:12.331197 4922 scope.go:117] "RemoveContainer" containerID="20488b03c617a43634c46564fbbe667e0a9ed54d02bba1ba2ca81cbbea0dad2c" Feb 18 12:08:15 crc kubenswrapper[4922]: I0218 12:08:15.930194 4922 generic.go:334] "Generic (PLEG): container finished" podID="b2f62f96-5ba4-4d16-89d8-11ae5e941699" containerID="46c56531dc7f6fe8b2ba5d33360e8f5c404c054782958003bf246dc84e31961c" exitCode=0 Feb 18 12:08:15 crc kubenswrapper[4922]: I0218 12:08:15.930475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" 
event={"ID":"b2f62f96-5ba4-4d16-89d8-11ae5e941699","Type":"ContainerDied","Data":"46c56531dc7f6fe8b2ba5d33360e8f5c404c054782958003bf246dc84e31961c"} Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.334310 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.465787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-inventory\") pod \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.467047 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-ssh-key-openstack-edpm-ipam\") pod \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.467090 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr9w4\" (UniqueName: \"kubernetes.io/projected/b2f62f96-5ba4-4d16-89d8-11ae5e941699-kube-api-access-tr9w4\") pod \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\" (UID: \"b2f62f96-5ba4-4d16-89d8-11ae5e941699\") " Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.474494 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f62f96-5ba4-4d16-89d8-11ae5e941699-kube-api-access-tr9w4" (OuterVolumeSpecName: "kube-api-access-tr9w4") pod "b2f62f96-5ba4-4d16-89d8-11ae5e941699" (UID: "b2f62f96-5ba4-4d16-89d8-11ae5e941699"). InnerVolumeSpecName "kube-api-access-tr9w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.501176 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-inventory" (OuterVolumeSpecName: "inventory") pod "b2f62f96-5ba4-4d16-89d8-11ae5e941699" (UID: "b2f62f96-5ba4-4d16-89d8-11ae5e941699"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.502914 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b2f62f96-5ba4-4d16-89d8-11ae5e941699" (UID: "b2f62f96-5ba4-4d16-89d8-11ae5e941699"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.570383 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.570431 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2f62f96-5ba4-4d16-89d8-11ae5e941699-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.570447 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr9w4\" (UniqueName: \"kubernetes.io/projected/b2f62f96-5ba4-4d16-89d8-11ae5e941699-kube-api-access-tr9w4\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.949163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" 
event={"ID":"b2f62f96-5ba4-4d16-89d8-11ae5e941699","Type":"ContainerDied","Data":"13c41ae592beddfaa7d7f1100f4757a6653e55fee294002ac2782f785aade831"} Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.949527 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13c41ae592beddfaa7d7f1100f4757a6653e55fee294002ac2782f785aade831" Feb 18 12:08:17 crc kubenswrapper[4922]: I0218 12:08:17.949228 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.047352 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc"] Feb 18 12:08:18 crc kubenswrapper[4922]: E0218 12:08:18.047735 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="extract-content" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.047751 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="extract-content" Feb 18 12:08:18 crc kubenswrapper[4922]: E0218 12:08:18.047768 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="extract-utilities" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.047777 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="extract-utilities" Feb 18 12:08:18 crc kubenswrapper[4922]: E0218 12:08:18.047790 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="registry-server" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.047800 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="registry-server" Feb 18 12:08:18 crc 
kubenswrapper[4922]: E0218 12:08:18.047828 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f62f96-5ba4-4d16-89d8-11ae5e941699" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.047834 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f62f96-5ba4-4d16-89d8-11ae5e941699" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.048016 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f62f96-5ba4-4d16-89d8-11ae5e941699" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.048030 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac7439c4-4267-4309-aae7-259734126f27" containerName="registry-server" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.062781 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.115660 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.116060 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.116333 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.116605 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.133558 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc"] Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.222968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.223201 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cc6l\" (UniqueName: \"kubernetes.io/projected/353e7c86-6842-40e4-ac3d-e2032eef15c5-kube-api-access-2cc6l\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 
12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.223445 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.325189 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cc6l\" (UniqueName: \"kubernetes.io/projected/353e7c86-6842-40e4-ac3d-e2032eef15c5-kube-api-access-2cc6l\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.325316 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.325446 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.335698 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.343542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.344350 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cc6l\" (UniqueName: \"kubernetes.io/projected/353e7c86-6842-40e4-ac3d-e2032eef15c5-kube-api-access-2cc6l\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hprwc\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.434612 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.772202 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc"] Feb 18 12:08:18 crc kubenswrapper[4922]: I0218 12:08:18.962739 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" event={"ID":"353e7c86-6842-40e4-ac3d-e2032eef15c5","Type":"ContainerStarted","Data":"f7144af24da3674fe3bfa5c89017af24270cc0d2432907f93d8a751e963ddc7d"} Feb 18 12:08:19 crc kubenswrapper[4922]: I0218 12:08:19.976431 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" event={"ID":"353e7c86-6842-40e4-ac3d-e2032eef15c5","Type":"ContainerStarted","Data":"024c9200cc944fa7d0d57d8c7ae8a256168a2253689f7368fddc0bc9307314f5"} Feb 18 12:08:20 crc kubenswrapper[4922]: I0218 12:08:20.008567 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" podStartSLOduration=1.268592464 podStartE2EDuration="2.008546983s" podCreationTimestamp="2026-02-18 12:08:18 +0000 UTC" firstStartedPulling="2026-02-18 12:08:18.772615203 +0000 UTC m=+1900.500319283" lastFinishedPulling="2026-02-18 12:08:19.512569722 +0000 UTC m=+1901.240273802" observedRunningTime="2026-02-18 12:08:19.994680803 +0000 UTC m=+1901.722384893" watchObservedRunningTime="2026-02-18 12:08:20.008546983 +0000 UTC m=+1901.736251093" Feb 18 12:08:23 crc kubenswrapper[4922]: I0218 12:08:23.973895 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:08:23 crc kubenswrapper[4922]: E0218 12:08:23.974626 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:08:25 crc kubenswrapper[4922]: I0218 12:08:25.021527 4922 generic.go:334] "Generic (PLEG): container finished" podID="353e7c86-6842-40e4-ac3d-e2032eef15c5" containerID="024c9200cc944fa7d0d57d8c7ae8a256168a2253689f7368fddc0bc9307314f5" exitCode=0 Feb 18 12:08:25 crc kubenswrapper[4922]: I0218 12:08:25.021583 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" event={"ID":"353e7c86-6842-40e4-ac3d-e2032eef15c5","Type":"ContainerDied","Data":"024c9200cc944fa7d0d57d8c7ae8a256168a2253689f7368fddc0bc9307314f5"} Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.049787 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqpbh"] Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.063893 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gts9l"] Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.074520 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gts9l"] Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.082788 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zqpbh"] Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.470131 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.527498 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cc6l\" (UniqueName: \"kubernetes.io/projected/353e7c86-6842-40e4-ac3d-e2032eef15c5-kube-api-access-2cc6l\") pod \"353e7c86-6842-40e4-ac3d-e2032eef15c5\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.534589 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353e7c86-6842-40e4-ac3d-e2032eef15c5-kube-api-access-2cc6l" (OuterVolumeSpecName: "kube-api-access-2cc6l") pod "353e7c86-6842-40e4-ac3d-e2032eef15c5" (UID: "353e7c86-6842-40e4-ac3d-e2032eef15c5"). InnerVolumeSpecName "kube-api-access-2cc6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.629180 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-inventory\") pod \"353e7c86-6842-40e4-ac3d-e2032eef15c5\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.629320 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-ssh-key-openstack-edpm-ipam\") pod \"353e7c86-6842-40e4-ac3d-e2032eef15c5\" (UID: \"353e7c86-6842-40e4-ac3d-e2032eef15c5\") " Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.630126 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cc6l\" (UniqueName: \"kubernetes.io/projected/353e7c86-6842-40e4-ac3d-e2032eef15c5-kube-api-access-2cc6l\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.655201 4922 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "353e7c86-6842-40e4-ac3d-e2032eef15c5" (UID: "353e7c86-6842-40e4-ac3d-e2032eef15c5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.655233 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-inventory" (OuterVolumeSpecName: "inventory") pod "353e7c86-6842-40e4-ac3d-e2032eef15c5" (UID: "353e7c86-6842-40e4-ac3d-e2032eef15c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.732329 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:26 crc kubenswrapper[4922]: I0218 12:08:26.732378 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/353e7c86-6842-40e4-ac3d-e2032eef15c5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.003082 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fc0adb-b8ae-4fd6-88eb-3b6357173103" path="/var/lib/kubelet/pods/95fc0adb-b8ae-4fd6-88eb-3b6357173103/volumes" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.004530 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aacb8ffe-ff30-4292-b253-1e12d07f499b" path="/var/lib/kubelet/pods/aacb8ffe-ff30-4292-b253-1e12d07f499b/volumes" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.044025 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" event={"ID":"353e7c86-6842-40e4-ac3d-e2032eef15c5","Type":"ContainerDied","Data":"f7144af24da3674fe3bfa5c89017af24270cc0d2432907f93d8a751e963ddc7d"} Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.044076 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7144af24da3674fe3bfa5c89017af24270cc0d2432907f93d8a751e963ddc7d" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.044095 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hprwc" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.205160 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh"] Feb 18 12:08:27 crc kubenswrapper[4922]: E0218 12:08:27.205743 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353e7c86-6842-40e4-ac3d-e2032eef15c5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.205762 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="353e7c86-6842-40e4-ac3d-e2032eef15c5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.206094 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="353e7c86-6842-40e4-ac3d-e2032eef15c5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.207031 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.209467 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.209713 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.209852 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.210967 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.215277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh"] Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.242577 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.242622 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64wq7\" (UniqueName: \"kubernetes.io/projected/c107695a-fdf7-48c6-b165-5e4dd2427148-kube-api-access-64wq7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.242706 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.344173 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.344246 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64wq7\" (UniqueName: \"kubernetes.io/projected/c107695a-fdf7-48c6-b165-5e4dd2427148-kube-api-access-64wq7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.344324 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.347837 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.348348 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.365344 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64wq7\" (UniqueName: \"kubernetes.io/projected/c107695a-fdf7-48c6-b165-5e4dd2427148-kube-api-access-64wq7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sz5wh\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:27 crc kubenswrapper[4922]: I0218 12:08:27.522057 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:08:28 crc kubenswrapper[4922]: I0218 12:08:28.032283 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh"] Feb 18 12:08:28 crc kubenswrapper[4922]: I0218 12:08:28.053949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" event={"ID":"c107695a-fdf7-48c6-b165-5e4dd2427148","Type":"ContainerStarted","Data":"6aca127bc3993d0dae302a205ad3f1b63aa98027f67d0e8fc71b42da03f6581b"} Feb 18 12:08:29 crc kubenswrapper[4922]: I0218 12:08:29.065047 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" event={"ID":"c107695a-fdf7-48c6-b165-5e4dd2427148","Type":"ContainerStarted","Data":"8ea96b3623a7dca277e92ddcee5cb2ddcc8b300dc1e184235406bf4e928c6ca2"} Feb 18 12:08:29 crc kubenswrapper[4922]: I0218 12:08:29.088990 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" podStartSLOduration=1.441219157 podStartE2EDuration="2.088965519s" podCreationTimestamp="2026-02-18 12:08:27 +0000 UTC" firstStartedPulling="2026-02-18 12:08:28.03729307 +0000 UTC m=+1909.764997150" lastFinishedPulling="2026-02-18 12:08:28.685039432 +0000 UTC m=+1910.412743512" observedRunningTime="2026-02-18 12:08:29.085333207 +0000 UTC m=+1910.813037287" watchObservedRunningTime="2026-02-18 12:08:29.088965519 +0000 UTC m=+1910.816669599" Feb 18 12:08:35 crc kubenswrapper[4922]: I0218 12:08:35.974691 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:08:35 crc kubenswrapper[4922]: E0218 12:08:35.975512 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:08:49 crc kubenswrapper[4922]: I0218 12:08:49.973243 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:08:49 crc kubenswrapper[4922]: E0218 12:08:49.974102 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:09:03 crc kubenswrapper[4922]: I0218 12:09:03.973484 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:09:03 crc kubenswrapper[4922]: E0218 12:09:03.974231 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:09:04 crc kubenswrapper[4922]: I0218 12:09:04.368081 4922 generic.go:334] "Generic (PLEG): container finished" podID="c107695a-fdf7-48c6-b165-5e4dd2427148" containerID="8ea96b3623a7dca277e92ddcee5cb2ddcc8b300dc1e184235406bf4e928c6ca2" exitCode=0 Feb 18 12:09:04 crc kubenswrapper[4922]: I0218 12:09:04.368138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" event={"ID":"c107695a-fdf7-48c6-b165-5e4dd2427148","Type":"ContainerDied","Data":"8ea96b3623a7dca277e92ddcee5cb2ddcc8b300dc1e184235406bf4e928c6ca2"} Feb 18 12:09:05 crc kubenswrapper[4922]: I0218 12:09:05.859745 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.007462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64wq7\" (UniqueName: \"kubernetes.io/projected/c107695a-fdf7-48c6-b165-5e4dd2427148-kube-api-access-64wq7\") pod \"c107695a-fdf7-48c6-b165-5e4dd2427148\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.007641 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-ssh-key-openstack-edpm-ipam\") pod \"c107695a-fdf7-48c6-b165-5e4dd2427148\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.007716 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-inventory\") pod \"c107695a-fdf7-48c6-b165-5e4dd2427148\" (UID: \"c107695a-fdf7-48c6-b165-5e4dd2427148\") " Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.013243 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c107695a-fdf7-48c6-b165-5e4dd2427148-kube-api-access-64wq7" (OuterVolumeSpecName: "kube-api-access-64wq7") pod "c107695a-fdf7-48c6-b165-5e4dd2427148" (UID: "c107695a-fdf7-48c6-b165-5e4dd2427148"). InnerVolumeSpecName "kube-api-access-64wq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.035313 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c107695a-fdf7-48c6-b165-5e4dd2427148" (UID: "c107695a-fdf7-48c6-b165-5e4dd2427148"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.040061 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-inventory" (OuterVolumeSpecName: "inventory") pod "c107695a-fdf7-48c6-b165-5e4dd2427148" (UID: "c107695a-fdf7-48c6-b165-5e4dd2427148"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.111998 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64wq7\" (UniqueName: \"kubernetes.io/projected/c107695a-fdf7-48c6-b165-5e4dd2427148-kube-api-access-64wq7\") on node \"crc\" DevicePath \"\"" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.112499 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.112628 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c107695a-fdf7-48c6-b165-5e4dd2427148-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.385866 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" 
event={"ID":"c107695a-fdf7-48c6-b165-5e4dd2427148","Type":"ContainerDied","Data":"6aca127bc3993d0dae302a205ad3f1b63aa98027f67d0e8fc71b42da03f6581b"} Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.386169 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aca127bc3993d0dae302a205ad3f1b63aa98027f67d0e8fc71b42da03f6581b" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.385925 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sz5wh" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.508147 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl"] Feb 18 12:09:06 crc kubenswrapper[4922]: E0218 12:09:06.508554 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107695a-fdf7-48c6-b165-5e4dd2427148" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.508572 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107695a-fdf7-48c6-b165-5e4dd2427148" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.508775 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107695a-fdf7-48c6-b165-5e4dd2427148" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.509412 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.512529 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.512752 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.512929 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.513087 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.523861 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl"] Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.526326 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.526384 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.526441 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xpnx\" (UniqueName: \"kubernetes.io/projected/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-kube-api-access-8xpnx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.628107 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.628319 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.628384 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xpnx\" (UniqueName: \"kubernetes.io/projected/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-kube-api-access-8xpnx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.633155 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.637905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.644232 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xpnx\" (UniqueName: \"kubernetes.io/projected/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-kube-api-access-8xpnx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:06 crc kubenswrapper[4922]: I0218 12:09:06.850464 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:07 crc kubenswrapper[4922]: I0218 12:09:07.391259 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl"] Feb 18 12:09:07 crc kubenswrapper[4922]: I0218 12:09:07.395514 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" event={"ID":"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d","Type":"ContainerStarted","Data":"b7b1ebedfebb46a47d356ce9223386e680a037fd14252fe17313a77c3838483a"} Feb 18 12:09:08 crc kubenswrapper[4922]: I0218 12:09:08.405243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" event={"ID":"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d","Type":"ContainerStarted","Data":"a8d14529d596b2d3019a7c74078324669b0e3b7ca1a5541f03a98ba8120df860"} Feb 18 12:09:08 crc kubenswrapper[4922]: I0218 12:09:08.432451 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" podStartSLOduration=2.023280997 podStartE2EDuration="2.432427736s" podCreationTimestamp="2026-02-18 12:09:06 +0000 UTC" firstStartedPulling="2026-02-18 12:09:07.374303474 +0000 UTC m=+1949.102007554" lastFinishedPulling="2026-02-18 12:09:07.783450213 +0000 UTC m=+1949.511154293" observedRunningTime="2026-02-18 12:09:08.423301585 +0000 UTC m=+1950.151005665" watchObservedRunningTime="2026-02-18 12:09:08.432427736 +0000 UTC m=+1950.160131826" Feb 18 12:09:12 crc kubenswrapper[4922]: I0218 12:09:12.503323 4922 scope.go:117] "RemoveContainer" containerID="3dd5bc6bbbea448a7d9229f94c3182f66fafad86532a2426f6a51eaa5b649205" Feb 18 12:09:12 crc kubenswrapper[4922]: I0218 12:09:12.560629 4922 scope.go:117] "RemoveContainer" containerID="ee60fdb1d51bfbe6bd8d0e50a4e4f1a8691171bcc2ba0c807ba59a9d88e2dc0d" Feb 18 12:09:14 crc 
kubenswrapper[4922]: I0218 12:09:14.038154 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dqtvp"] Feb 18 12:09:14 crc kubenswrapper[4922]: I0218 12:09:14.046171 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dqtvp"] Feb 18 12:09:14 crc kubenswrapper[4922]: I0218 12:09:14.983697 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3440f4a-2f1d-4d69-aafe-ec2eb86183cc" path="/var/lib/kubelet/pods/b3440f4a-2f1d-4d69-aafe-ec2eb86183cc/volumes" Feb 18 12:09:15 crc kubenswrapper[4922]: I0218 12:09:15.974903 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:09:15 crc kubenswrapper[4922]: E0218 12:09:15.975794 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:09:27 crc kubenswrapper[4922]: I0218 12:09:27.973494 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:09:27 crc kubenswrapper[4922]: E0218 12:09:27.974666 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:09:41 crc kubenswrapper[4922]: I0218 12:09:41.974613 4922 scope.go:117] 
"RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:09:42 crc kubenswrapper[4922]: I0218 12:09:42.726010 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"f96eabfecb9bc217d8636beb3db027fe21b0aa914ece8706e84fc03338c589ff"} Feb 18 12:09:51 crc kubenswrapper[4922]: I0218 12:09:51.823122 4922 generic.go:334] "Generic (PLEG): container finished" podID="ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" containerID="a8d14529d596b2d3019a7c74078324669b0e3b7ca1a5541f03a98ba8120df860" exitCode=0 Feb 18 12:09:51 crc kubenswrapper[4922]: I0218 12:09:51.823224 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" event={"ID":"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d","Type":"ContainerDied","Data":"a8d14529d596b2d3019a7c74078324669b0e3b7ca1a5541f03a98ba8120df860"} Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.288610 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.344984 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-ssh-key-openstack-edpm-ipam\") pod \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.345038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xpnx\" (UniqueName: \"kubernetes.io/projected/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-kube-api-access-8xpnx\") pod \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.345337 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-inventory\") pod \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\" (UID: \"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d\") " Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.352825 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-kube-api-access-8xpnx" (OuterVolumeSpecName: "kube-api-access-8xpnx") pod "ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" (UID: "ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d"). InnerVolumeSpecName "kube-api-access-8xpnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.381474 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-inventory" (OuterVolumeSpecName: "inventory") pod "ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" (UID: "ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.384483 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" (UID: "ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.447750 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.447800 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.447815 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xpnx\" (UniqueName: \"kubernetes.io/projected/ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d-kube-api-access-8xpnx\") on node \"crc\" DevicePath \"\"" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.843777 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" event={"ID":"ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d","Type":"ContainerDied","Data":"b7b1ebedfebb46a47d356ce9223386e680a037fd14252fe17313a77c3838483a"} Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 12:09:53.844121 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7b1ebedfebb46a47d356ce9223386e680a037fd14252fe17313a77c3838483a" Feb 18 12:09:53 crc kubenswrapper[4922]: I0218 
12:09:53.843934 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.024842 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cz4px"] Feb 18 12:09:54 crc kubenswrapper[4922]: E0218 12:09:54.025307 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.025328 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.025546 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.026220 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.029947 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.030640 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.031831 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.033508 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.037116 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cz4px"] Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.059972 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.061345 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9gz6\" (UniqueName: \"kubernetes.io/projected/c2fa843a-470e-441c-93c9-8c412459933b-kube-api-access-m9gz6\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.061506 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.163796 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gz6\" (UniqueName: \"kubernetes.io/projected/c2fa843a-470e-441c-93c9-8c412459933b-kube-api-access-m9gz6\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.163913 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.164727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.167932 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc 
kubenswrapper[4922]: I0218 12:09:54.173960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.184562 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gz6\" (UniqueName: \"kubernetes.io/projected/c2fa843a-470e-441c-93c9-8c412459933b-kube-api-access-m9gz6\") pod \"ssh-known-hosts-edpm-deployment-cz4px\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.347307 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:09:54 crc kubenswrapper[4922]: I0218 12:09:54.854254 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cz4px"] Feb 18 12:09:55 crc kubenswrapper[4922]: I0218 12:09:55.873007 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" event={"ID":"c2fa843a-470e-441c-93c9-8c412459933b","Type":"ContainerStarted","Data":"2155458a0d41d2a078dfa477e75d3f295a29102fe0d4b31385906a721ac9fc69"} Feb 18 12:09:55 crc kubenswrapper[4922]: I0218 12:09:55.873250 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" event={"ID":"c2fa843a-470e-441c-93c9-8c412459933b","Type":"ContainerStarted","Data":"cb56bd5543e36cd0c5573b96d86a626b879a0205ac0dce5406cef7788990a831"} Feb 18 12:09:55 crc kubenswrapper[4922]: I0218 12:09:55.898040 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" podStartSLOduration=1.525958537 podStartE2EDuration="1.89802061s" podCreationTimestamp="2026-02-18 12:09:54 +0000 UTC" firstStartedPulling="2026-02-18 12:09:54.867443013 +0000 UTC m=+1996.595147093" lastFinishedPulling="2026-02-18 12:09:55.239505086 +0000 UTC m=+1996.967209166" observedRunningTime="2026-02-18 12:09:55.888837708 +0000 UTC m=+1997.616541798" watchObservedRunningTime="2026-02-18 12:09:55.89802061 +0000 UTC m=+1997.625724690" Feb 18 12:10:01 crc kubenswrapper[4922]: I0218 12:10:01.927701 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2fa843a-470e-441c-93c9-8c412459933b" containerID="2155458a0d41d2a078dfa477e75d3f295a29102fe0d4b31385906a721ac9fc69" exitCode=0 Feb 18 12:10:01 crc kubenswrapper[4922]: I0218 12:10:01.927746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" event={"ID":"c2fa843a-470e-441c-93c9-8c412459933b","Type":"ContainerDied","Data":"2155458a0d41d2a078dfa477e75d3f295a29102fe0d4b31385906a721ac9fc69"} Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.347749 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.548101 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-ssh-key-openstack-edpm-ipam\") pod \"c2fa843a-470e-441c-93c9-8c412459933b\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.548300 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9gz6\" (UniqueName: \"kubernetes.io/projected/c2fa843a-470e-441c-93c9-8c412459933b-kube-api-access-m9gz6\") pod \"c2fa843a-470e-441c-93c9-8c412459933b\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.548410 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-inventory-0\") pod \"c2fa843a-470e-441c-93c9-8c412459933b\" (UID: \"c2fa843a-470e-441c-93c9-8c412459933b\") " Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.557734 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fa843a-470e-441c-93c9-8c412459933b-kube-api-access-m9gz6" (OuterVolumeSpecName: "kube-api-access-m9gz6") pod "c2fa843a-470e-441c-93c9-8c412459933b" (UID: "c2fa843a-470e-441c-93c9-8c412459933b"). InnerVolumeSpecName "kube-api-access-m9gz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.931556 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9gz6\" (UniqueName: \"kubernetes.io/projected/c2fa843a-470e-441c-93c9-8c412459933b-kube-api-access-m9gz6\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.931974 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c2fa843a-470e-441c-93c9-8c412459933b" (UID: "c2fa843a-470e-441c-93c9-8c412459933b"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.939091 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c2fa843a-470e-441c-93c9-8c412459933b" (UID: "c2fa843a-470e-441c-93c9-8c412459933b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.951099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" event={"ID":"c2fa843a-470e-441c-93c9-8c412459933b","Type":"ContainerDied","Data":"cb56bd5543e36cd0c5573b96d86a626b879a0205ac0dce5406cef7788990a831"} Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.951155 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb56bd5543e36cd0c5573b96d86a626b879a0205ac0dce5406cef7788990a831" Feb 18 12:10:03 crc kubenswrapper[4922]: I0218 12:10:03.951202 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cz4px" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.034412 4922 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.034459 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2fa843a-470e-441c-93c9-8c412459933b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.110444 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25"] Feb 18 12:10:04 crc kubenswrapper[4922]: E0218 12:10:04.110980 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fa843a-470e-441c-93c9-8c412459933b" containerName="ssh-known-hosts-edpm-deployment" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.111006 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fa843a-470e-441c-93c9-8c412459933b" containerName="ssh-known-hosts-edpm-deployment" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.111253 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fa843a-470e-441c-93c9-8c412459933b" containerName="ssh-known-hosts-edpm-deployment" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.112090 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.114678 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.114737 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.115136 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.115254 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.126719 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25"] Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.245497 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.245835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.245859 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dhd\" (UniqueName: \"kubernetes.io/projected/227ab888-976c-4ce1-beb8-abbe305c6d79-kube-api-access-92dhd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.347564 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.347680 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.347704 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92dhd\" (UniqueName: \"kubernetes.io/projected/227ab888-976c-4ce1-beb8-abbe305c6d79-kube-api-access-92dhd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.353201 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: 
\"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.353215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.367424 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dhd\" (UniqueName: \"kubernetes.io/projected/227ab888-976c-4ce1-beb8-abbe305c6d79-kube-api-access-92dhd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-b5h25\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.451603 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.989080 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25"] Feb 18 12:10:04 crc kubenswrapper[4922]: I0218 12:10:04.997945 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:10:05 crc kubenswrapper[4922]: I0218 12:10:05.971475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" event={"ID":"227ab888-976c-4ce1-beb8-abbe305c6d79","Type":"ContainerStarted","Data":"6a338d676d67c02553a3ed22517161f630c3e29222fabb352e3c16eeab57926b"} Feb 18 12:10:05 crc kubenswrapper[4922]: I0218 12:10:05.972181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" event={"ID":"227ab888-976c-4ce1-beb8-abbe305c6d79","Type":"ContainerStarted","Data":"e45b93f4844ffc72c1bc4c029021b6cc531116c46cb3c0bd6ccfb945234158a8"} Feb 18 12:10:06 crc kubenswrapper[4922]: I0218 12:10:06.003890 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" podStartSLOduration=1.463399148 podStartE2EDuration="2.003866152s" podCreationTimestamp="2026-02-18 12:10:04 +0000 UTC" firstStartedPulling="2026-02-18 12:10:04.997651881 +0000 UTC m=+2006.725355971" lastFinishedPulling="2026-02-18 12:10:05.538118885 +0000 UTC m=+2007.265822975" observedRunningTime="2026-02-18 12:10:05.989419878 +0000 UTC m=+2007.717123958" watchObservedRunningTime="2026-02-18 12:10:06.003866152 +0000 UTC m=+2007.731570232" Feb 18 12:10:12 crc kubenswrapper[4922]: I0218 12:10:12.651457 4922 scope.go:117] "RemoveContainer" containerID="fe9e0b8c506169c8ff2962eae62f59ba2039b683aba9a221a651171f7491288f" Feb 18 12:10:14 crc kubenswrapper[4922]: I0218 12:10:14.061304 
4922 generic.go:334] "Generic (PLEG): container finished" podID="227ab888-976c-4ce1-beb8-abbe305c6d79" containerID="6a338d676d67c02553a3ed22517161f630c3e29222fabb352e3c16eeab57926b" exitCode=0 Feb 18 12:10:14 crc kubenswrapper[4922]: I0218 12:10:14.061415 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" event={"ID":"227ab888-976c-4ce1-beb8-abbe305c6d79","Type":"ContainerDied","Data":"6a338d676d67c02553a3ed22517161f630c3e29222fabb352e3c16eeab57926b"} Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.510592 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.592080 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-inventory\") pod \"227ab888-976c-4ce1-beb8-abbe305c6d79\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.592156 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92dhd\" (UniqueName: \"kubernetes.io/projected/227ab888-976c-4ce1-beb8-abbe305c6d79-kube-api-access-92dhd\") pod \"227ab888-976c-4ce1-beb8-abbe305c6d79\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.592308 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-ssh-key-openstack-edpm-ipam\") pod \"227ab888-976c-4ce1-beb8-abbe305c6d79\" (UID: \"227ab888-976c-4ce1-beb8-abbe305c6d79\") " Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.597614 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/227ab888-976c-4ce1-beb8-abbe305c6d79-kube-api-access-92dhd" (OuterVolumeSpecName: "kube-api-access-92dhd") pod "227ab888-976c-4ce1-beb8-abbe305c6d79" (UID: "227ab888-976c-4ce1-beb8-abbe305c6d79"). InnerVolumeSpecName "kube-api-access-92dhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.618410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "227ab888-976c-4ce1-beb8-abbe305c6d79" (UID: "227ab888-976c-4ce1-beb8-abbe305c6d79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.626645 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-inventory" (OuterVolumeSpecName: "inventory") pod "227ab888-976c-4ce1-beb8-abbe305c6d79" (UID: "227ab888-976c-4ce1-beb8-abbe305c6d79"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.694437 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.694464 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/227ab888-976c-4ce1-beb8-abbe305c6d79-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:15 crc kubenswrapper[4922]: I0218 12:10:15.694474 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92dhd\" (UniqueName: \"kubernetes.io/projected/227ab888-976c-4ce1-beb8-abbe305c6d79-kube-api-access-92dhd\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.081150 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" event={"ID":"227ab888-976c-4ce1-beb8-abbe305c6d79","Type":"ContainerDied","Data":"e45b93f4844ffc72c1bc4c029021b6cc531116c46cb3c0bd6ccfb945234158a8"} Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.081199 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e45b93f4844ffc72c1bc4c029021b6cc531116c46cb3c0bd6ccfb945234158a8" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.081201 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-b5h25" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.180982 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z"] Feb 18 12:10:16 crc kubenswrapper[4922]: E0218 12:10:16.181510 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227ab888-976c-4ce1-beb8-abbe305c6d79" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.181533 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="227ab888-976c-4ce1-beb8-abbe305c6d79" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.181759 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="227ab888-976c-4ce1-beb8-abbe305c6d79" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.182472 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.185365 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.185611 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.185795 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.185987 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.204470 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z"] Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.304228 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.305380 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.305774 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g5h4\" (UniqueName: \"kubernetes.io/projected/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-kube-api-access-4g5h4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.407671 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g5h4\" (UniqueName: \"kubernetes.io/projected/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-kube-api-access-4g5h4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.408077 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.408120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.415451 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.423014 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.429933 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g5h4\" (UniqueName: \"kubernetes.io/projected/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-kube-api-access-4g5h4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:16 crc kubenswrapper[4922]: I0218 12:10:16.502917 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:17 crc kubenswrapper[4922]: I0218 12:10:17.058653 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z"] Feb 18 12:10:17 crc kubenswrapper[4922]: I0218 12:10:17.097385 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" event={"ID":"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9","Type":"ContainerStarted","Data":"38c17c5d036cba7c5c3872091f42de94c68d4f5d546f513297a4d61e02240874"} Feb 18 12:10:18 crc kubenswrapper[4922]: I0218 12:10:18.106135 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" event={"ID":"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9","Type":"ContainerStarted","Data":"daaa6c1a89adf85e202516fca8131d072d601b2824c5cbed83b90661a7b68d6f"} Feb 18 12:10:18 crc kubenswrapper[4922]: I0218 12:10:18.127760 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" podStartSLOduration=1.741161312 podStartE2EDuration="2.127740658s" podCreationTimestamp="2026-02-18 12:10:16 +0000 UTC" firstStartedPulling="2026-02-18 12:10:17.072273038 +0000 UTC m=+2018.799977128" lastFinishedPulling="2026-02-18 12:10:17.458852394 +0000 UTC m=+2019.186556474" observedRunningTime="2026-02-18 12:10:18.126137788 +0000 UTC m=+2019.853841868" watchObservedRunningTime="2026-02-18 12:10:18.127740658 +0000 UTC m=+2019.855444738" Feb 18 12:10:27 crc kubenswrapper[4922]: I0218 12:10:27.195505 4922 generic.go:334] "Generic (PLEG): container finished" podID="fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" containerID="daaa6c1a89adf85e202516fca8131d072d601b2824c5cbed83b90661a7b68d6f" exitCode=0 Feb 18 12:10:27 crc kubenswrapper[4922]: I0218 12:10:27.195605 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" event={"ID":"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9","Type":"ContainerDied","Data":"daaa6c1a89adf85e202516fca8131d072d601b2824c5cbed83b90661a7b68d6f"} Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.622154 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.747004 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g5h4\" (UniqueName: \"kubernetes.io/projected/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-kube-api-access-4g5h4\") pod \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.747057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-ssh-key-openstack-edpm-ipam\") pod \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.747288 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-inventory\") pod \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\" (UID: \"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9\") " Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.753921 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-kube-api-access-4g5h4" (OuterVolumeSpecName: "kube-api-access-4g5h4") pod "fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" (UID: "fac1ed4a-2fa4-4220-80fb-f54e3a357fb9"). InnerVolumeSpecName "kube-api-access-4g5h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.774793 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-inventory" (OuterVolumeSpecName: "inventory") pod "fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" (UID: "fac1ed4a-2fa4-4220-80fb-f54e3a357fb9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.777788 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" (UID: "fac1ed4a-2fa4-4220-80fb-f54e3a357fb9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.849749 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.849800 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g5h4\" (UniqueName: \"kubernetes.io/projected/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-kube-api-access-4g5h4\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:28 crc kubenswrapper[4922]: I0218 12:10:28.849818 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fac1ed4a-2fa4-4220-80fb-f54e3a357fb9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.212930 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" 
event={"ID":"fac1ed4a-2fa4-4220-80fb-f54e3a357fb9","Type":"ContainerDied","Data":"38c17c5d036cba7c5c3872091f42de94c68d4f5d546f513297a4d61e02240874"} Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.212967 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38c17c5d036cba7c5c3872091f42de94c68d4f5d546f513297a4d61e02240874" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.212971 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.311959 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq"] Feb 18 12:10:29 crc kubenswrapper[4922]: E0218 12:10:29.314123 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.314154 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.314505 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac1ed4a-2fa4-4220-80fb-f54e3a357fb9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.315412 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.321214 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.321406 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.321498 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.321941 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.322035 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.322129 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.322182 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.322231 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.324822 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq"] Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.460337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.460502 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.460543 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.460691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6mcp\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-kube-api-access-k6mcp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.460765 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461007 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461057 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461086 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461120 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461163 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461207 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461307 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461338 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.461395 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.562833 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.562878 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.562918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6mcp\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-kube-api-access-k6mcp\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.562953 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.562981 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563042 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: 
\"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563066 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563092 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563108 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563163 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563183 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563204 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.563227 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.567260 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 
12:10:29.567551 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.568011 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.571681 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.571847 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.572162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.573552 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.573964 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.574687 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.575142 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.575810 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.577438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.583756 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.583787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6mcp\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-kube-api-access-k6mcp\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bxngq\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:29 crc kubenswrapper[4922]: I0218 12:10:29.635874 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:10:30 crc kubenswrapper[4922]: I0218 12:10:30.237605 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq"] Feb 18 12:10:31 crc kubenswrapper[4922]: I0218 12:10:31.231485 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" event={"ID":"98bc83e7-66dd-4133-82cd-d4301c233f9d","Type":"ContainerStarted","Data":"4938616f4f8b9f59ee92ec69c7db137c366133a3da099ee49fe218fc024cbfe1"} Feb 18 12:10:31 crc kubenswrapper[4922]: I0218 12:10:31.232997 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" event={"ID":"98bc83e7-66dd-4133-82cd-d4301c233f9d","Type":"ContainerStarted","Data":"daacfe7efb7e08b3efeba8d32115cca5950451e2edadcd24196c068b03499b35"} Feb 18 12:10:31 crc kubenswrapper[4922]: I0218 12:10:31.256098 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" podStartSLOduration=1.6755501179999999 podStartE2EDuration="2.256076507s" podCreationTimestamp="2026-02-18 12:10:29 +0000 UTC" firstStartedPulling="2026-02-18 12:10:30.247074447 +0000 UTC m=+2031.974778527" lastFinishedPulling="2026-02-18 12:10:30.827600826 +0000 UTC m=+2032.555304916" observedRunningTime="2026-02-18 12:10:31.254153388 +0000 UTC m=+2032.981857468" watchObservedRunningTime="2026-02-18 12:10:31.256076507 +0000 UTC m=+2032.983780587" Feb 18 12:11:06 crc kubenswrapper[4922]: I0218 12:11:06.545890 4922 
generic.go:334] "Generic (PLEG): container finished" podID="98bc83e7-66dd-4133-82cd-d4301c233f9d" containerID="4938616f4f8b9f59ee92ec69c7db137c366133a3da099ee49fe218fc024cbfe1" exitCode=0 Feb 18 12:11:06 crc kubenswrapper[4922]: I0218 12:11:06.545951 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" event={"ID":"98bc83e7-66dd-4133-82cd-d4301c233f9d","Type":"ContainerDied","Data":"4938616f4f8b9f59ee92ec69c7db137c366133a3da099ee49fe218fc024cbfe1"} Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.043094 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.186956 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-inventory\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187030 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6mcp\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-kube-api-access-k6mcp\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-libvirt-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187183 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187252 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187329 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-telemetry-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187352 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-nova-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187396 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-bootstrap-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187434 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187457 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-repo-setup-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187477 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ssh-key-openstack-edpm-ipam\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187500 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187522 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ovn-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.187551 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-neutron-metadata-combined-ca-bundle\") pod \"98bc83e7-66dd-4133-82cd-d4301c233f9d\" (UID: \"98bc83e7-66dd-4133-82cd-d4301c233f9d\") " Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.194859 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.195436 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.195954 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.196022 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.196229 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.196981 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.197848 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.197951 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.198290 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.207463 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.207586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-kube-api-access-k6mcp" (OuterVolumeSpecName: "kube-api-access-k6mcp") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "kube-api-access-k6mcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.207595 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.220067 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-inventory" (OuterVolumeSpecName: "inventory") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.234988 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98bc83e7-66dd-4133-82cd-d4301c233f9d" (UID: "98bc83e7-66dd-4133-82cd-d4301c233f9d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292072 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6mcp\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-kube-api-access-k6mcp\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292118 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292133 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292147 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292160 4922 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292171 4922 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292182 4922 reconciler_common.go:293] "Volume detached for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292195 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292206 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292233 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292248 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98bc83e7-66dd-4133-82cd-d4301c233f9d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292260 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292271 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 
12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.292284 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98bc83e7-66dd-4133-82cd-d4301c233f9d-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.567717 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq" event={"ID":"98bc83e7-66dd-4133-82cd-d4301c233f9d","Type":"ContainerDied","Data":"daacfe7efb7e08b3efeba8d32115cca5950451e2edadcd24196c068b03499b35"}
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.567765 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daacfe7efb7e08b3efeba8d32115cca5950451e2edadcd24196c068b03499b35"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.567825 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bxngq"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.680093 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"]
Feb 18 12:11:08 crc kubenswrapper[4922]: E0218 12:11:08.683080 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bc83e7-66dd-4133-82cd-d4301c233f9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.683125 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bc83e7-66dd-4133-82cd-d4301c233f9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.683495 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bc83e7-66dd-4133-82cd-d4301c233f9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.684732 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.687492 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.687664 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.687812 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.687826 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.687976 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.697746 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"]
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.804121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.804208 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45d322f9-bf52-4679-ab43-9d222bc09a14-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.804442 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.804836 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l94rs\" (UniqueName: \"kubernetes.io/projected/45d322f9-bf52-4679-ab43-9d222bc09a14-kube-api-access-l94rs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.804893 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.907503 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l94rs\" (UniqueName: \"kubernetes.io/projected/45d322f9-bf52-4679-ab43-9d222bc09a14-kube-api-access-l94rs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.907559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.907635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.907688 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45d322f9-bf52-4679-ab43-9d222bc09a14-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.907739 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.908758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45d322f9-bf52-4679-ab43-9d222bc09a14-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.912532 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.914639 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.916154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:08 crc kubenswrapper[4922]: I0218 12:11:08.925194 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l94rs\" (UniqueName: \"kubernetes.io/projected/45d322f9-bf52-4679-ab43-9d222bc09a14-kube-api-access-l94rs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8jmtt\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:09 crc kubenswrapper[4922]: I0218 12:11:09.001662 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:11:09 crc kubenswrapper[4922]: I0218 12:11:09.540172 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"]
Feb 18 12:11:09 crc kubenswrapper[4922]: I0218 12:11:09.581929 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt" event={"ID":"45d322f9-bf52-4679-ab43-9d222bc09a14","Type":"ContainerStarted","Data":"cb909c8a47f940a517a676dc5536d0bf0b70a866ff9c6682d5855478d99f6690"}
Feb 18 12:11:10 crc kubenswrapper[4922]: I0218 12:11:10.592499 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt" event={"ID":"45d322f9-bf52-4679-ab43-9d222bc09a14","Type":"ContainerStarted","Data":"14e866305ca7680d979cf41bd87ba2c36720433b54f20a2b9b9d179d1f1c9a18"}
Feb 18 12:11:10 crc kubenswrapper[4922]: I0218 12:11:10.619502 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt" podStartSLOduration=2.185812717 podStartE2EDuration="2.619479258s" podCreationTimestamp="2026-02-18 12:11:08 +0000 UTC" firstStartedPulling="2026-02-18 12:11:09.551952985 +0000 UTC m=+2071.279657065" lastFinishedPulling="2026-02-18 12:11:09.985619526 +0000 UTC m=+2071.713323606" observedRunningTime="2026-02-18 12:11:10.608458861 +0000 UTC m=+2072.336162961" watchObservedRunningTime="2026-02-18 12:11:10.619479258 +0000 UTC m=+2072.347183348"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.190305 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f9wgh"]
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.206198 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.215260 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9wgh"]
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.394659 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpsh7\" (UniqueName: \"kubernetes.io/projected/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-kube-api-access-rpsh7\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.394791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-utilities\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.394853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-catalog-content\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.496275 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpsh7\" (UniqueName: \"kubernetes.io/projected/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-kube-api-access-rpsh7\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.496709 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-utilities\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.496851 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-catalog-content\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.497567 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-utilities\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.497585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-catalog-content\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.519381 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpsh7\" (UniqueName: \"kubernetes.io/projected/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-kube-api-access-rpsh7\") pod \"redhat-operators-f9wgh\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:50 crc kubenswrapper[4922]: I0218 12:11:50.536101 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:11:51 crc kubenswrapper[4922]: I0218 12:11:51.022707 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9wgh"]
Feb 18 12:11:51 crc kubenswrapper[4922]: I0218 12:11:51.978134 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerID="9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2" exitCode=0
Feb 18 12:11:51 crc kubenswrapper[4922]: I0218 12:11:51.978214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerDied","Data":"9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2"}
Feb 18 12:11:51 crc kubenswrapper[4922]: I0218 12:11:51.978475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerStarted","Data":"a04d7d069df5d9960d42f6d4bd51a40213af02b5d84049f2834e25d340b39cd9"}
Feb 18 12:11:54 crc kubenswrapper[4922]: I0218 12:11:54.023837 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerStarted","Data":"6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc"}
Feb 18 12:11:56 crc kubenswrapper[4922]: I0218 12:11:56.044547 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerID="6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc" exitCode=0
Feb 18 12:11:56 crc kubenswrapper[4922]: I0218 12:11:56.044622 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerDied","Data":"6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc"}
Feb 18 12:11:58 crc kubenswrapper[4922]: I0218 12:11:58.068484 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerStarted","Data":"ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24"}
Feb 18 12:12:00 crc kubenswrapper[4922]: I0218 12:12:00.536274 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:12:00 crc kubenswrapper[4922]: I0218 12:12:00.536906 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f9wgh"
Feb 18 12:12:01 crc kubenswrapper[4922]: I0218 12:12:01.593743 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f9wgh" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" probeResult="failure" output=<
Feb 18 12:12:01 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Feb 18 12:12:01 crc kubenswrapper[4922]: >
Feb 18 12:12:09 crc kubenswrapper[4922]: I0218 12:12:09.808096 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:12:09 crc kubenswrapper[4922]: I0218 12:12:09.808718 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:12:11 crc kubenswrapper[4922]: I0218 12:12:11.592685 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f9wgh" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" probeResult="failure" output=<
Feb 18 12:12:11 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Feb 18 12:12:11 crc kubenswrapper[4922]: >
Feb 18 12:12:13 crc kubenswrapper[4922]: I0218 12:12:13.329288 4922 generic.go:334] "Generic (PLEG): container finished" podID="45d322f9-bf52-4679-ab43-9d222bc09a14" containerID="14e866305ca7680d979cf41bd87ba2c36720433b54f20a2b9b9d179d1f1c9a18" exitCode=0
Feb 18 12:12:13 crc kubenswrapper[4922]: I0218 12:12:13.329460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt" event={"ID":"45d322f9-bf52-4679-ab43-9d222bc09a14","Type":"ContainerDied","Data":"14e866305ca7680d979cf41bd87ba2c36720433b54f20a2b9b9d179d1f1c9a18"}
Feb 18 12:12:13 crc kubenswrapper[4922]: I0218 12:12:13.359630 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f9wgh" podStartSLOduration=18.123089104 podStartE2EDuration="23.359607624s" podCreationTimestamp="2026-02-18 12:11:50 +0000 UTC" firstStartedPulling="2026-02-18 12:11:51.98104474 +0000 UTC m=+2113.708748820" lastFinishedPulling="2026-02-18 12:11:57.21756326 +0000 UTC m=+2118.945267340" observedRunningTime="2026-02-18 12:11:58.092843963 +0000 UTC m=+2119.820548043" watchObservedRunningTime="2026-02-18 12:12:13.359607624 +0000 UTC m=+2135.087311704"
Feb 18 12:12:14 crc kubenswrapper[4922]: I0218 12:12:14.948492 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.086677 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45d322f9-bf52-4679-ab43-9d222bc09a14-ovncontroller-config-0\") pod \"45d322f9-bf52-4679-ab43-9d222bc09a14\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") "
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.086766 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l94rs\" (UniqueName: \"kubernetes.io/projected/45d322f9-bf52-4679-ab43-9d222bc09a14-kube-api-access-l94rs\") pod \"45d322f9-bf52-4679-ab43-9d222bc09a14\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") "
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.086817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-inventory\") pod \"45d322f9-bf52-4679-ab43-9d222bc09a14\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") "
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.087116 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ovn-combined-ca-bundle\") pod \"45d322f9-bf52-4679-ab43-9d222bc09a14\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") "
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.088022 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ssh-key-openstack-edpm-ipam\") pod \"45d322f9-bf52-4679-ab43-9d222bc09a14\" (UID: \"45d322f9-bf52-4679-ab43-9d222bc09a14\") "
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.097280 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d322f9-bf52-4679-ab43-9d222bc09a14-kube-api-access-l94rs" (OuterVolumeSpecName: "kube-api-access-l94rs") pod "45d322f9-bf52-4679-ab43-9d222bc09a14" (UID: "45d322f9-bf52-4679-ab43-9d222bc09a14"). InnerVolumeSpecName "kube-api-access-l94rs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.101831 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "45d322f9-bf52-4679-ab43-9d222bc09a14" (UID: "45d322f9-bf52-4679-ab43-9d222bc09a14"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.125788 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d322f9-bf52-4679-ab43-9d222bc09a14-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "45d322f9-bf52-4679-ab43-9d222bc09a14" (UID: "45d322f9-bf52-4679-ab43-9d222bc09a14"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.132137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "45d322f9-bf52-4679-ab43-9d222bc09a14" (UID: "45d322f9-bf52-4679-ab43-9d222bc09a14"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.138503 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-inventory" (OuterVolumeSpecName: "inventory") pod "45d322f9-bf52-4679-ab43-9d222bc09a14" (UID: "45d322f9-bf52-4679-ab43-9d222bc09a14"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.191753 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.191807 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.191819 4922 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45d322f9-bf52-4679-ab43-9d222bc09a14-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.191830 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l94rs\" (UniqueName: \"kubernetes.io/projected/45d322f9-bf52-4679-ab43-9d222bc09a14-kube-api-access-l94rs\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.191843 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45d322f9-bf52-4679-ab43-9d222bc09a14-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.352337 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt" event={"ID":"45d322f9-bf52-4679-ab43-9d222bc09a14","Type":"ContainerDied","Data":"cb909c8a47f940a517a676dc5536d0bf0b70a866ff9c6682d5855478d99f6690"}
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.352400 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb909c8a47f940a517a676dc5536d0bf0b70a866ff9c6682d5855478d99f6690"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.352515 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8jmtt"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.526168 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"]
Feb 18 12:12:15 crc kubenswrapper[4922]: E0218 12:12:15.526689 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d322f9-bf52-4679-ab43-9d222bc09a14" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.526711 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d322f9-bf52-4679-ab43-9d222bc09a14" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.526959 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d322f9-bf52-4679-ab43-9d222bc09a14" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.527846 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.535135 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.535328 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.535457 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.537240 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.537514 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.538636 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.539925 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"]
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.704467 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkvl\" (UniqueName: \"kubernetes.io/projected/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-kube-api-access-htkvl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.704544 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.704892 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.705038 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.705113 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.705190 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.806633 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.806987 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.807020 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.807067 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.807150 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkvl\" (UniqueName: \"kubernetes.io/projected/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-kube-api-access-htkvl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.807190 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.811629 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.811737 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.812247 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.815930 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.817643 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.826712 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkvl\" (UniqueName: \"kubernetes.io/projected/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-kube-api-access-htkvl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:15 crc kubenswrapper[4922]: I0218 12:12:15.861539 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"
Feb 18 12:12:16 crc kubenswrapper[4922]: I0218 12:12:16.443646 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6"]
Feb 18 12:12:17 crc kubenswrapper[4922]: I0218 12:12:17.371588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" event={"ID":"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3","Type":"ContainerStarted","Data":"f675a65ec5122eac9b3db3da6055626f4e90d9a34b75684d8bd611667cdcc4bc"}
Feb 18 12:12:17 crc kubenswrapper[4922]: I0218 12:12:17.371902 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" event={"ID":"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3","Type":"ContainerStarted","Data":"a81e1f243cb58e624334569d3d46568771a74e45026b207775af54bf872d2f98"}
Feb 18 12:12:17 crc kubenswrapper[4922]: I0218 12:12:17.394767 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" podStartSLOduration=1.857260345 podStartE2EDuration="2.39474805s" podCreationTimestamp="2026-02-18 12:12:15 +0000 UTC" firstStartedPulling="2026-02-18 12:12:16.473786308 +0000 UTC m=+2138.201490388" lastFinishedPulling="2026-02-18 12:12:17.011274013 +0000 UTC m=+2138.738978093" observedRunningTime="2026-02-18 12:12:17.393908789 +0000 UTC m=+2139.121612879" watchObservedRunningTime="2026-02-18 12:12:17.39474805 +0000 UTC m=+2139.122452130"
Feb 18 12:12:19 crc kubenswrapper[4922]: I0218 12:12:19.921339 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5vz25"]
Feb 18 12:12:19 crc kubenswrapper[4922]: I0218 12:12:19.924678 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:19 crc kubenswrapper[4922]: I0218 12:12:19.940209 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vz25"] Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.100222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-catalog-content\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.100300 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tct6\" (UniqueName: \"kubernetes.io/projected/c22488e9-a8cd-4400-8b66-15074c7726ac-kube-api-access-6tct6\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.100460 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-utilities\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.201712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-utilities\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" 
Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.201902 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-catalog-content\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.201940 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tct6\" (UniqueName: \"kubernetes.io/projected/c22488e9-a8cd-4400-8b66-15074c7726ac-kube-api-access-6tct6\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.202652 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-utilities\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.202862 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-catalog-content\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.241559 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tct6\" (UniqueName: \"kubernetes.io/projected/c22488e9-a8cd-4400-8b66-15074c7726ac-kube-api-access-6tct6\") pod \"certified-operators-5vz25\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc 
kubenswrapper[4922]: I0218 12:12:20.256435 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:20 crc kubenswrapper[4922]: I0218 12:12:20.802021 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5vz25"] Feb 18 12:12:21 crc kubenswrapper[4922]: I0218 12:12:21.415343 4922 generic.go:334] "Generic (PLEG): container finished" podID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerID="97ef2eea12376eeab2e6df9623ea90d67e62db6f4755d6707c06eecdb163bde2" exitCode=0 Feb 18 12:12:21 crc kubenswrapper[4922]: I0218 12:12:21.415425 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vz25" event={"ID":"c22488e9-a8cd-4400-8b66-15074c7726ac","Type":"ContainerDied","Data":"97ef2eea12376eeab2e6df9623ea90d67e62db6f4755d6707c06eecdb163bde2"} Feb 18 12:12:21 crc kubenswrapper[4922]: I0218 12:12:21.415453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vz25" event={"ID":"c22488e9-a8cd-4400-8b66-15074c7726ac","Type":"ContainerStarted","Data":"8baac64d2919d5b2b53ee7bfc0c20fa53158f07aa3235e48a21697aa69aeb1df"} Feb 18 12:12:21 crc kubenswrapper[4922]: I0218 12:12:21.593225 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f9wgh" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" probeResult="failure" output=< Feb 18 12:12:21 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 12:12:21 crc kubenswrapper[4922]: > Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.115245 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c7wgq"] Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.121694 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.133573 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7wgq"] Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.189692 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-utilities\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.189814 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxqg\" (UniqueName: \"kubernetes.io/projected/12118652-851f-47e2-ac7d-42304ca159f7-kube-api-access-qcxqg\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.189878 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-catalog-content\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.292846 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-utilities\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.292976 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qcxqg\" (UniqueName: \"kubernetes.io/projected/12118652-851f-47e2-ac7d-42304ca159f7-kube-api-access-qcxqg\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.293044 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-catalog-content\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.293398 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-utilities\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.293800 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-catalog-content\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.316802 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxqg\" (UniqueName: \"kubernetes.io/projected/12118652-851f-47e2-ac7d-42304ca159f7-kube-api-access-qcxqg\") pod \"redhat-marketplace-c7wgq\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.454191 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:23 crc kubenswrapper[4922]: I0218 12:12:23.954575 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7wgq"] Feb 18 12:12:24 crc kubenswrapper[4922]: I0218 12:12:24.441730 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerStarted","Data":"7e19e5b7a0886f12574215201d36d33ceaa48e9faa615612e6dc043561627932"} Feb 18 12:12:25 crc kubenswrapper[4922]: I0218 12:12:25.452520 4922 generic.go:334] "Generic (PLEG): container finished" podID="12118652-851f-47e2-ac7d-42304ca159f7" containerID="dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf" exitCode=0 Feb 18 12:12:25 crc kubenswrapper[4922]: I0218 12:12:25.452633 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerDied","Data":"dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf"} Feb 18 12:12:25 crc kubenswrapper[4922]: I0218 12:12:25.454448 4922 generic.go:334] "Generic (PLEG): container finished" podID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerID="3e055de6ac902e5ab3c63140a9b1f08e83d9d8d12f85796a582b63697fa0133a" exitCode=0 Feb 18 12:12:25 crc kubenswrapper[4922]: I0218 12:12:25.454478 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vz25" event={"ID":"c22488e9-a8cd-4400-8b66-15074c7726ac","Type":"ContainerDied","Data":"3e055de6ac902e5ab3c63140a9b1f08e83d9d8d12f85796a582b63697fa0133a"} Feb 18 12:12:27 crc kubenswrapper[4922]: I0218 12:12:27.483465 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vz25" 
event={"ID":"c22488e9-a8cd-4400-8b66-15074c7726ac","Type":"ContainerStarted","Data":"f452d7b83cd45bdffb82f76f87e8bc9e821a596574b505e62f7c495e49975eb8"} Feb 18 12:12:27 crc kubenswrapper[4922]: I0218 12:12:27.487726 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerStarted","Data":"eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa"} Feb 18 12:12:27 crc kubenswrapper[4922]: I0218 12:12:27.512600 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5vz25" podStartSLOduration=2.793957605 podStartE2EDuration="8.512580495s" podCreationTimestamp="2026-02-18 12:12:19 +0000 UTC" firstStartedPulling="2026-02-18 12:12:21.417622338 +0000 UTC m=+2143.145326408" lastFinishedPulling="2026-02-18 12:12:27.136245218 +0000 UTC m=+2148.863949298" observedRunningTime="2026-02-18 12:12:27.501495006 +0000 UTC m=+2149.229199106" watchObservedRunningTime="2026-02-18 12:12:27.512580495 +0000 UTC m=+2149.240284575" Feb 18 12:12:29 crc kubenswrapper[4922]: I0218 12:12:29.513175 4922 generic.go:334] "Generic (PLEG): container finished" podID="12118652-851f-47e2-ac7d-42304ca159f7" containerID="eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa" exitCode=0 Feb 18 12:12:29 crc kubenswrapper[4922]: I0218 12:12:29.513265 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerDied","Data":"eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa"} Feb 18 12:12:30 crc kubenswrapper[4922]: I0218 12:12:30.257496 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:30 crc kubenswrapper[4922]: I0218 12:12:30.258711 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:30 crc kubenswrapper[4922]: I0218 12:12:30.589882 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f9wgh" Feb 18 12:12:30 crc kubenswrapper[4922]: I0218 12:12:30.648888 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f9wgh" Feb 18 12:12:31 crc kubenswrapper[4922]: I0218 12:12:31.308830 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5vz25" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="registry-server" probeResult="failure" output=< Feb 18 12:12:31 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 12:12:31 crc kubenswrapper[4922]: > Feb 18 12:12:31 crc kubenswrapper[4922]: I0218 12:12:31.531714 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerStarted","Data":"8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607"} Feb 18 12:12:31 crc kubenswrapper[4922]: I0218 12:12:31.557660 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c7wgq" podStartSLOduration=3.6933986450000003 podStartE2EDuration="8.557638481s" podCreationTimestamp="2026-02-18 12:12:23 +0000 UTC" firstStartedPulling="2026-02-18 12:12:25.454647611 +0000 UTC m=+2147.182351681" lastFinishedPulling="2026-02-18 12:12:30.318887447 +0000 UTC m=+2152.046591517" observedRunningTime="2026-02-18 12:12:31.549311572 +0000 UTC m=+2153.277015642" watchObservedRunningTime="2026-02-18 12:12:31.557638481 +0000 UTC m=+2153.285342571" Feb 18 12:12:33 crc kubenswrapper[4922]: I0218 12:12:33.455155 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:33 crc kubenswrapper[4922]: I0218 12:12:33.455200 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:33 crc kubenswrapper[4922]: I0218 12:12:33.506130 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:37 crc kubenswrapper[4922]: I0218 12:12:37.704429 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9wgh"] Feb 18 12:12:37 crc kubenswrapper[4922]: I0218 12:12:37.705133 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f9wgh" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" containerID="cri-o://ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24" gracePeriod=2 Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.162642 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f9wgh" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.187496 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-catalog-content\") pod \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.187566 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpsh7\" (UniqueName: \"kubernetes.io/projected/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-kube-api-access-rpsh7\") pod \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.194624 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-kube-api-access-rpsh7" (OuterVolumeSpecName: "kube-api-access-rpsh7") pod "0c0e4049-cc63-4ef1-aef5-0542ca9b9667" (UID: "0c0e4049-cc63-4ef1-aef5-0542ca9b9667"). InnerVolumeSpecName "kube-api-access-rpsh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.289353 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-utilities\") pod \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\" (UID: \"0c0e4049-cc63-4ef1-aef5-0542ca9b9667\") " Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.289863 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpsh7\" (UniqueName: \"kubernetes.io/projected/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-kube-api-access-rpsh7\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.290180 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-utilities" (OuterVolumeSpecName: "utilities") pod "0c0e4049-cc63-4ef1-aef5-0542ca9b9667" (UID: "0c0e4049-cc63-4ef1-aef5-0542ca9b9667"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.306307 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c0e4049-cc63-4ef1-aef5-0542ca9b9667" (UID: "0c0e4049-cc63-4ef1-aef5-0542ca9b9667"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.391340 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.391427 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0e4049-cc63-4ef1-aef5-0542ca9b9667-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.591630 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerID="ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24" exitCode=0 Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.591673 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerDied","Data":"ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24"} Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.591699 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9wgh" event={"ID":"0c0e4049-cc63-4ef1-aef5-0542ca9b9667","Type":"ContainerDied","Data":"a04d7d069df5d9960d42f6d4bd51a40213af02b5d84049f2834e25d340b39cd9"} Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.591716 4922 scope.go:117] "RemoveContainer" containerID="ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.591852 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f9wgh" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.616311 4922 scope.go:117] "RemoveContainer" containerID="6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.626248 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9wgh"] Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.635897 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f9wgh"] Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.662106 4922 scope.go:117] "RemoveContainer" containerID="9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.685753 4922 scope.go:117] "RemoveContainer" containerID="ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24" Feb 18 12:12:38 crc kubenswrapper[4922]: E0218 12:12:38.686251 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24\": container with ID starting with ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24 not found: ID does not exist" containerID="ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.686478 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24"} err="failed to get container status \"ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24\": rpc error: code = NotFound desc = could not find container \"ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24\": container with ID starting with ce2a92bf867d95dab323be5ac67b666234b7cc50ed9bdbfe009004e18881fc24 not found: ID does 
not exist" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.686579 4922 scope.go:117] "RemoveContainer" containerID="6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc" Feb 18 12:12:38 crc kubenswrapper[4922]: E0218 12:12:38.687152 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc\": container with ID starting with 6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc not found: ID does not exist" containerID="6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.687236 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc"} err="failed to get container status \"6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc\": rpc error: code = NotFound desc = could not find container \"6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc\": container with ID starting with 6a8d4cd1b4b048bf54c76705d96da87ce8de69c632ee41b7145096277e801fbc not found: ID does not exist" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.687296 4922 scope.go:117] "RemoveContainer" containerID="9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2" Feb 18 12:12:38 crc kubenswrapper[4922]: E0218 12:12:38.687734 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2\": container with ID starting with 9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2 not found: ID does not exist" containerID="9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.687763 4922 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2"} err="failed to get container status \"9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2\": rpc error: code = NotFound desc = could not find container \"9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2\": container with ID starting with 9890a2620c97984231eef2e007309f75dd81b3c214c8050c5d3c9d04775450e2 not found: ID does not exist" Feb 18 12:12:38 crc kubenswrapper[4922]: I0218 12:12:38.985021 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" path="/var/lib/kubelet/pods/0c0e4049-cc63-4ef1-aef5-0542ca9b9667/volumes" Feb 18 12:12:39 crc kubenswrapper[4922]: I0218 12:12:39.807855 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:12:39 crc kubenswrapper[4922]: I0218 12:12:39.808225 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:12:40 crc kubenswrapper[4922]: I0218 12:12:40.309409 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:40 crc kubenswrapper[4922]: I0218 12:12:40.364331 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:42 crc kubenswrapper[4922]: I0218 12:12:42.318602 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-5vz25"] Feb 18 12:12:42 crc kubenswrapper[4922]: I0218 12:12:42.319805 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5vz25" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="registry-server" containerID="cri-o://f452d7b83cd45bdffb82f76f87e8bc9e821a596574b505e62f7c495e49975eb8" gracePeriod=2 Feb 18 12:12:42 crc kubenswrapper[4922]: I0218 12:12:42.632780 4922 generic.go:334] "Generic (PLEG): container finished" podID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerID="f452d7b83cd45bdffb82f76f87e8bc9e821a596574b505e62f7c495e49975eb8" exitCode=0 Feb 18 12:12:42 crc kubenswrapper[4922]: I0218 12:12:42.632874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vz25" event={"ID":"c22488e9-a8cd-4400-8b66-15074c7726ac","Type":"ContainerDied","Data":"f452d7b83cd45bdffb82f76f87e8bc9e821a596574b505e62f7c495e49975eb8"} Feb 18 12:12:42 crc kubenswrapper[4922]: I0218 12:12:42.815320 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.006848 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tct6\" (UniqueName: \"kubernetes.io/projected/c22488e9-a8cd-4400-8b66-15074c7726ac-kube-api-access-6tct6\") pod \"c22488e9-a8cd-4400-8b66-15074c7726ac\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.006913 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-utilities\") pod \"c22488e9-a8cd-4400-8b66-15074c7726ac\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.007067 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-catalog-content\") pod \"c22488e9-a8cd-4400-8b66-15074c7726ac\" (UID: \"c22488e9-a8cd-4400-8b66-15074c7726ac\") " Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.008022 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-utilities" (OuterVolumeSpecName: "utilities") pod "c22488e9-a8cd-4400-8b66-15074c7726ac" (UID: "c22488e9-a8cd-4400-8b66-15074c7726ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.013162 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c22488e9-a8cd-4400-8b66-15074c7726ac-kube-api-access-6tct6" (OuterVolumeSpecName: "kube-api-access-6tct6") pod "c22488e9-a8cd-4400-8b66-15074c7726ac" (UID: "c22488e9-a8cd-4400-8b66-15074c7726ac"). InnerVolumeSpecName "kube-api-access-6tct6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.062345 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c22488e9-a8cd-4400-8b66-15074c7726ac" (UID: "c22488e9-a8cd-4400-8b66-15074c7726ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.113396 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.113424 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22488e9-a8cd-4400-8b66-15074c7726ac-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.113436 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tct6\" (UniqueName: \"kubernetes.io/projected/c22488e9-a8cd-4400-8b66-15074c7726ac-kube-api-access-6tct6\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.505341 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.647831 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5vz25" event={"ID":"c22488e9-a8cd-4400-8b66-15074c7726ac","Type":"ContainerDied","Data":"8baac64d2919d5b2b53ee7bfc0c20fa53158f07aa3235e48a21697aa69aeb1df"} Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.647896 4922 scope.go:117] "RemoveContainer" 
containerID="f452d7b83cd45bdffb82f76f87e8bc9e821a596574b505e62f7c495e49975eb8" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.647957 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5vz25" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.668839 4922 scope.go:117] "RemoveContainer" containerID="3e055de6ac902e5ab3c63140a9b1f08e83d9d8d12f85796a582b63697fa0133a" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.694953 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5vz25"] Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.705593 4922 scope.go:117] "RemoveContainer" containerID="97ef2eea12376eeab2e6df9623ea90d67e62db6f4755d6707c06eecdb163bde2" Feb 18 12:12:43 crc kubenswrapper[4922]: I0218 12:12:43.706445 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5vz25"] Feb 18 12:12:44 crc kubenswrapper[4922]: I0218 12:12:44.986244 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" path="/var/lib/kubelet/pods/c22488e9-a8cd-4400-8b66-15074c7726ac/volumes" Feb 18 12:12:45 crc kubenswrapper[4922]: I0218 12:12:45.908785 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7wgq"] Feb 18 12:12:45 crc kubenswrapper[4922]: I0218 12:12:45.909168 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c7wgq" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="registry-server" containerID="cri-o://8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607" gracePeriod=2 Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.421454 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.593074 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-catalog-content\") pod \"12118652-851f-47e2-ac7d-42304ca159f7\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.593194 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-utilities\") pod \"12118652-851f-47e2-ac7d-42304ca159f7\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.593441 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcxqg\" (UniqueName: \"kubernetes.io/projected/12118652-851f-47e2-ac7d-42304ca159f7-kube-api-access-qcxqg\") pod \"12118652-851f-47e2-ac7d-42304ca159f7\" (UID: \"12118652-851f-47e2-ac7d-42304ca159f7\") " Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.594171 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-utilities" (OuterVolumeSpecName: "utilities") pod "12118652-851f-47e2-ac7d-42304ca159f7" (UID: "12118652-851f-47e2-ac7d-42304ca159f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.600477 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12118652-851f-47e2-ac7d-42304ca159f7-kube-api-access-qcxqg" (OuterVolumeSpecName: "kube-api-access-qcxqg") pod "12118652-851f-47e2-ac7d-42304ca159f7" (UID: "12118652-851f-47e2-ac7d-42304ca159f7"). InnerVolumeSpecName "kube-api-access-qcxqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.622111 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12118652-851f-47e2-ac7d-42304ca159f7" (UID: "12118652-851f-47e2-ac7d-42304ca159f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.684395 4922 generic.go:334] "Generic (PLEG): container finished" podID="12118652-851f-47e2-ac7d-42304ca159f7" containerID="8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607" exitCode=0 Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.684455 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerDied","Data":"8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607"} Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.684495 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7wgq" event={"ID":"12118652-851f-47e2-ac7d-42304ca159f7","Type":"ContainerDied","Data":"7e19e5b7a0886f12574215201d36d33ceaa48e9faa615612e6dc043561627932"} Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.684521 4922 scope.go:117] "RemoveContainer" containerID="8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.684683 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7wgq" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.696626 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.696675 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12118652-851f-47e2-ac7d-42304ca159f7-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.696688 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcxqg\" (UniqueName: \"kubernetes.io/projected/12118652-851f-47e2-ac7d-42304ca159f7-kube-api-access-qcxqg\") on node \"crc\" DevicePath \"\"" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.731418 4922 scope.go:117] "RemoveContainer" containerID="eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.734812 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7wgq"] Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.745667 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7wgq"] Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.757096 4922 scope.go:117] "RemoveContainer" containerID="dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.805864 4922 scope.go:117] "RemoveContainer" containerID="8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607" Feb 18 12:12:46 crc kubenswrapper[4922]: E0218 12:12:46.806392 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607\": container with ID starting with 8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607 not found: ID does not exist" containerID="8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.806438 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607"} err="failed to get container status \"8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607\": rpc error: code = NotFound desc = could not find container \"8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607\": container with ID starting with 8b6f8a1b87bc5c610a532fb17ee32a73f90f8b4873b63a0ecc7e0fb900c07607 not found: ID does not exist" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.806470 4922 scope.go:117] "RemoveContainer" containerID="eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa" Feb 18 12:12:46 crc kubenswrapper[4922]: E0218 12:12:46.806930 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa\": container with ID starting with eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa not found: ID does not exist" containerID="eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.806951 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa"} err="failed to get container status \"eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa\": rpc error: code = NotFound desc = could not find container \"eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa\": container with ID 
starting with eddc46b683c01644b3304efd906e75b5a9fd440e15fcfe930f86c07c2b3ba4fa not found: ID does not exist" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.806966 4922 scope.go:117] "RemoveContainer" containerID="dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf" Feb 18 12:12:46 crc kubenswrapper[4922]: E0218 12:12:46.807329 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf\": container with ID starting with dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf not found: ID does not exist" containerID="dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.807398 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf"} err="failed to get container status \"dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf\": rpc error: code = NotFound desc = could not find container \"dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf\": container with ID starting with dd1dba4d12b91de3b2090b7189eb56e82f21a53041e34adc93da1219ee52a1cf not found: ID does not exist" Feb 18 12:12:46 crc kubenswrapper[4922]: I0218 12:12:46.998041 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12118652-851f-47e2-ac7d-42304ca159f7" path="/var/lib/kubelet/pods/12118652-851f-47e2-ac7d-42304ca159f7/volumes" Feb 18 12:13:04 crc kubenswrapper[4922]: I0218 12:13:04.862020 4922 generic.go:334] "Generic (PLEG): container finished" podID="9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" containerID="f675a65ec5122eac9b3db3da6055626f4e90d9a34b75684d8bd611667cdcc4bc" exitCode=0 Feb 18 12:13:04 crc kubenswrapper[4922]: I0218 12:13:04.862113 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" event={"ID":"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3","Type":"ContainerDied","Data":"f675a65ec5122eac9b3db3da6055626f4e90d9a34b75684d8bd611667cdcc4bc"} Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.282316 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.311541 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-nova-metadata-neutron-config-0\") pod \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.311951 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htkvl\" (UniqueName: \"kubernetes.io/projected/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-kube-api-access-htkvl\") pod \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.312018 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.312070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-metadata-combined-ca-bundle\") pod \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " 
Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.312176 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-inventory\") pod \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.312223 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-ssh-key-openstack-edpm-ipam\") pod \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\" (UID: \"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3\") " Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.317408 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" (UID: "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.318408 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-kube-api-access-htkvl" (OuterVolumeSpecName: "kube-api-access-htkvl") pod "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" (UID: "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3"). InnerVolumeSpecName "kube-api-access-htkvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.341339 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" (UID: "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.350209 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" (UID: "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.358151 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" (UID: "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.371257 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-inventory" (OuterVolumeSpecName: "inventory") pod "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" (UID: "9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.417544 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.419058 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htkvl\" (UniqueName: \"kubernetes.io/projected/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-kube-api-access-htkvl\") on node \"crc\" DevicePath \"\"" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.419189 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.419261 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.419335 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.419500 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.882013 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" event={"ID":"9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3","Type":"ContainerDied","Data":"a81e1f243cb58e624334569d3d46568771a74e45026b207775af54bf872d2f98"} Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.882316 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81e1f243cb58e624334569d3d46568771a74e45026b207775af54bf872d2f98" Feb 18 12:13:06 crc kubenswrapper[4922]: I0218 12:13:06.882062 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.000812 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf"] Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001277 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="extract-utilities" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001294 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="extract-utilities" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001313 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="extract-content" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001323 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="extract-content" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001342 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="extract-utilities" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001352 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="extract-utilities" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001392 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001402 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001418 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="extract-content" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001425 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="extract-content" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001435 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001443 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001458 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001466 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001478 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001488 4922 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001504 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="extract-utilities" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001513 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="extract-utilities" Feb 18 12:13:07 crc kubenswrapper[4922]: E0218 12:13:07.001537 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="extract-content" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001544 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="extract-content" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001786 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22488e9-a8cd-4400-8b66-15074c7726ac" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001820 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="12118652-851f-47e2-ac7d-42304ca159f7" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001832 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.001845 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0e4049-cc63-4ef1-aef5-0542ca9b9667" containerName="registry-server" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.006523 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.008741 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.008913 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.009685 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.009964 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.011004 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.032665 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.033337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6h8x\" (UniqueName: \"kubernetes.io/projected/7d136111-09bf-46fe-aaf8-868a27741f9b-kube-api-access-c6h8x\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.034130 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.034266 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.034313 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.039666 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf"] Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.136275 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6h8x\" (UniqueName: \"kubernetes.io/projected/7d136111-09bf-46fe-aaf8-868a27741f9b-kube-api-access-c6h8x\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.136402 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.136493 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.136640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.136768 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.142952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: 
\"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.143020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.143779 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.144055 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.158772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6h8x\" (UniqueName: \"kubernetes.io/projected/7d136111-09bf-46fe-aaf8-868a27741f9b-kube-api-access-c6h8x\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.334568 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:13:07 crc kubenswrapper[4922]: I0218 12:13:07.898954 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf"] Feb 18 12:13:08 crc kubenswrapper[4922]: I0218 12:13:08.912819 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" event={"ID":"7d136111-09bf-46fe-aaf8-868a27741f9b","Type":"ContainerStarted","Data":"004c81ea37bbcda1faf27f9cf0255e5a647f141e3451153704351a8c28aa6714"} Feb 18 12:13:08 crc kubenswrapper[4922]: I0218 12:13:08.912899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" event={"ID":"7d136111-09bf-46fe-aaf8-868a27741f9b","Type":"ContainerStarted","Data":"d334758ef7c460ce4bd03b306cede9ac406be6bf21f14ff74b552e5a162c62ec"} Feb 18 12:13:08 crc kubenswrapper[4922]: I0218 12:13:08.941836 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" podStartSLOduration=2.199500702 podStartE2EDuration="2.941813855s" podCreationTimestamp="2026-02-18 12:13:06 +0000 UTC" firstStartedPulling="2026-02-18 12:13:07.910595766 +0000 UTC m=+2189.638299846" lastFinishedPulling="2026-02-18 12:13:08.652908919 +0000 UTC m=+2190.380612999" observedRunningTime="2026-02-18 12:13:08.928853298 +0000 UTC m=+2190.656557398" watchObservedRunningTime="2026-02-18 12:13:08.941813855 +0000 UTC m=+2190.669517935" Feb 18 12:13:09 crc kubenswrapper[4922]: I0218 12:13:09.806961 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:13:09 crc kubenswrapper[4922]: I0218 
12:13:09.807272 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:13:09 crc kubenswrapper[4922]: I0218 12:13:09.807316 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 12:13:09 crc kubenswrapper[4922]: I0218 12:13:09.808407 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f96eabfecb9bc217d8636beb3db027fe21b0aa914ece8706e84fc03338c589ff"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:13:09 crc kubenswrapper[4922]: I0218 12:13:09.809528 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://f96eabfecb9bc217d8636beb3db027fe21b0aa914ece8706e84fc03338c589ff" gracePeriod=600 Feb 18 12:13:10 crc kubenswrapper[4922]: I0218 12:13:10.938780 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="f96eabfecb9bc217d8636beb3db027fe21b0aa914ece8706e84fc03338c589ff" exitCode=0 Feb 18 12:13:10 crc kubenswrapper[4922]: I0218 12:13:10.938919 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"f96eabfecb9bc217d8636beb3db027fe21b0aa914ece8706e84fc03338c589ff"} Feb 18 12:13:10 crc 
kubenswrapper[4922]: I0218 12:13:10.939799 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"} Feb 18 12:13:10 crc kubenswrapper[4922]: I0218 12:13:10.939839 4922 scope.go:117] "RemoveContainer" containerID="8b1f76f76a4d90d637663bb4e07ac7219793473bf03151c5be72aa190a2db8df" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.160020 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4"] Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.162401 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.167508 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.168074 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.174625 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4"] Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.251915 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fefb3a87-d203-4ac1-b63d-61c582015132-secret-volume\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.252640 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fefb3a87-d203-4ac1-b63d-61c582015132-config-volume\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.253166 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcltl\" (UniqueName: \"kubernetes.io/projected/fefb3a87-d203-4ac1-b63d-61c582015132-kube-api-access-mcltl\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.355930 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fefb3a87-d203-4ac1-b63d-61c582015132-secret-volume\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.356139 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fefb3a87-d203-4ac1-b63d-61c582015132-config-volume\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.356185 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcltl\" (UniqueName: \"kubernetes.io/projected/fefb3a87-d203-4ac1-b63d-61c582015132-kube-api-access-mcltl\") pod \"collect-profiles-29523615-grpz4\" (UID: 
\"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.357792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fefb3a87-d203-4ac1-b63d-61c582015132-config-volume\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.368256 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fefb3a87-d203-4ac1-b63d-61c582015132-secret-volume\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.379429 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcltl\" (UniqueName: \"kubernetes.io/projected/fefb3a87-d203-4ac1-b63d-61c582015132-kube-api-access-mcltl\") pod \"collect-profiles-29523615-grpz4\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.484827 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.961330 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4"] Feb 18 12:15:00 crc kubenswrapper[4922]: I0218 12:15:00.990113 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" event={"ID":"fefb3a87-d203-4ac1-b63d-61c582015132","Type":"ContainerStarted","Data":"28cef7f1097db2361d5e94108dcd6c6a37119408cbb2f95de09d7b9bf0a7695b"} Feb 18 12:15:02 crc kubenswrapper[4922]: I0218 12:15:02.003298 4922 generic.go:334] "Generic (PLEG): container finished" podID="fefb3a87-d203-4ac1-b63d-61c582015132" containerID="5b401b8ee4f7943af0a7b7807634c73c9cc5371f7bb8ea18f378db7de3390a99" exitCode=0 Feb 18 12:15:02 crc kubenswrapper[4922]: I0218 12:15:02.003502 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" event={"ID":"fefb3a87-d203-4ac1-b63d-61c582015132","Type":"ContainerDied","Data":"5b401b8ee4f7943af0a7b7807634c73c9cc5371f7bb8ea18f378db7de3390a99"} Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.358119 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.430749 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fefb3a87-d203-4ac1-b63d-61c582015132-config-volume\") pod \"fefb3a87-d203-4ac1-b63d-61c582015132\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.430931 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fefb3a87-d203-4ac1-b63d-61c582015132-secret-volume\") pod \"fefb3a87-d203-4ac1-b63d-61c582015132\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.430969 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcltl\" (UniqueName: \"kubernetes.io/projected/fefb3a87-d203-4ac1-b63d-61c582015132-kube-api-access-mcltl\") pod \"fefb3a87-d203-4ac1-b63d-61c582015132\" (UID: \"fefb3a87-d203-4ac1-b63d-61c582015132\") " Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.432592 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fefb3a87-d203-4ac1-b63d-61c582015132-config-volume" (OuterVolumeSpecName: "config-volume") pod "fefb3a87-d203-4ac1-b63d-61c582015132" (UID: "fefb3a87-d203-4ac1-b63d-61c582015132"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.439946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefb3a87-d203-4ac1-b63d-61c582015132-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fefb3a87-d203-4ac1-b63d-61c582015132" (UID: "fefb3a87-d203-4ac1-b63d-61c582015132"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.440060 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fefb3a87-d203-4ac1-b63d-61c582015132-kube-api-access-mcltl" (OuterVolumeSpecName: "kube-api-access-mcltl") pod "fefb3a87-d203-4ac1-b63d-61c582015132" (UID: "fefb3a87-d203-4ac1-b63d-61c582015132"). InnerVolumeSpecName "kube-api-access-mcltl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.533238 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fefb3a87-d203-4ac1-b63d-61c582015132-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.533274 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fefb3a87-d203-4ac1-b63d-61c582015132-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:15:03 crc kubenswrapper[4922]: I0218 12:15:03.533283 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcltl\" (UniqueName: \"kubernetes.io/projected/fefb3a87-d203-4ac1-b63d-61c582015132-kube-api-access-mcltl\") on node \"crc\" DevicePath \"\"" Feb 18 12:15:04 crc kubenswrapper[4922]: I0218 12:15:04.023460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" event={"ID":"fefb3a87-d203-4ac1-b63d-61c582015132","Type":"ContainerDied","Data":"28cef7f1097db2361d5e94108dcd6c6a37119408cbb2f95de09d7b9bf0a7695b"} Feb 18 12:15:04 crc kubenswrapper[4922]: I0218 12:15:04.023508 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28cef7f1097db2361d5e94108dcd6c6a37119408cbb2f95de09d7b9bf0a7695b" Feb 18 12:15:04 crc kubenswrapper[4922]: I0218 12:15:04.023521 4922 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4" Feb 18 12:15:04 crc kubenswrapper[4922]: I0218 12:15:04.438027 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"] Feb 18 12:15:04 crc kubenswrapper[4922]: I0218 12:15:04.446931 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523570-k54zv"] Feb 18 12:15:04 crc kubenswrapper[4922]: I0218 12:15:04.988695 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c707c4-5c62-438f-8312-2307d3ef0ba8" path="/var/lib/kubelet/pods/75c707c4-5c62-438f-8312-2307d3ef0ba8/volumes" Feb 18 12:15:12 crc kubenswrapper[4922]: I0218 12:15:12.878602 4922 scope.go:117] "RemoveContainer" containerID="28b5d335d68c0542326e75d8138276312eb3264af281f35e13ca715cee393fd1" Feb 18 12:15:39 crc kubenswrapper[4922]: I0218 12:15:39.807080 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:15:39 crc kubenswrapper[4922]: I0218 12:15:39.808694 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:16:09 crc kubenswrapper[4922]: I0218 12:16:09.808111 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 18 12:16:09 crc kubenswrapper[4922]: I0218 12:16:09.808764 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:16:39 crc kubenswrapper[4922]: I0218 12:16:39.808119 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:16:39 crc kubenswrapper[4922]: I0218 12:16:39.808698 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:16:39 crc kubenswrapper[4922]: I0218 12:16:39.808758 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 12:16:39 crc kubenswrapper[4922]: I0218 12:16:39.809570 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:16:39 crc kubenswrapper[4922]: I0218 12:16:39.809707 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" gracePeriod=600 Feb 18 12:16:39 crc kubenswrapper[4922]: E0218 12:16:39.945404 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:16:40 crc kubenswrapper[4922]: I0218 12:16:40.915810 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" exitCode=0 Feb 18 12:16:40 crc kubenswrapper[4922]: I0218 12:16:40.916119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"} Feb 18 12:16:40 crc kubenswrapper[4922]: I0218 12:16:40.916151 4922 scope.go:117] "RemoveContainer" containerID="f96eabfecb9bc217d8636beb3db027fe21b0aa914ece8706e84fc03338c589ff" Feb 18 12:16:40 crc kubenswrapper[4922]: I0218 12:16:40.916822 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:16:40 crc kubenswrapper[4922]: E0218 12:16:40.917075 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:16:53 crc kubenswrapper[4922]: I0218 12:16:53.973695 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:16:53 crc kubenswrapper[4922]: E0218 12:16:53.974768 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:17:07 crc kubenswrapper[4922]: I0218 12:17:07.973979 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:17:07 crc kubenswrapper[4922]: E0218 12:17:07.975721 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:17:21 crc kubenswrapper[4922]: I0218 12:17:21.974671 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:17:21 crc kubenswrapper[4922]: E0218 12:17:21.975543 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.326965 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vqh2"] Feb 18 12:17:25 crc kubenswrapper[4922]: E0218 12:17:25.327836 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefb3a87-d203-4ac1-b63d-61c582015132" containerName="collect-profiles" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.327856 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefb3a87-d203-4ac1-b63d-61c582015132" containerName="collect-profiles" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.329026 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefb3a87-d203-4ac1-b63d-61c582015132" containerName="collect-profiles" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.330700 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.350647 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqh2"] Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.377379 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-catalog-content\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.377802 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkr4g\" (UniqueName: \"kubernetes.io/projected/526949a6-53f6-4b36-b4ec-48a4a8b612e9-kube-api-access-dkr4g\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.377883 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-utilities\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.481814 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkr4g\" (UniqueName: \"kubernetes.io/projected/526949a6-53f6-4b36-b4ec-48a4a8b612e9-kube-api-access-dkr4g\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.481897 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-utilities\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.481981 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-catalog-content\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.482592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-catalog-content\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.482875 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-utilities\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.507321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkr4g\" (UniqueName: \"kubernetes.io/projected/526949a6-53f6-4b36-b4ec-48a4a8b612e9-kube-api-access-dkr4g\") pod \"community-operators-8vqh2\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") " pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:25 crc kubenswrapper[4922]: I0218 12:17:25.654443 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:26 crc kubenswrapper[4922]: I0218 12:17:26.248578 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqh2"] Feb 18 12:17:26 crc kubenswrapper[4922]: I0218 12:17:26.351482 4922 generic.go:334] "Generic (PLEG): container finished" podID="7d136111-09bf-46fe-aaf8-868a27741f9b" containerID="004c81ea37bbcda1faf27f9cf0255e5a647f141e3451153704351a8c28aa6714" exitCode=0 Feb 18 12:17:26 crc kubenswrapper[4922]: I0218 12:17:26.351564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" event={"ID":"7d136111-09bf-46fe-aaf8-868a27741f9b","Type":"ContainerDied","Data":"004c81ea37bbcda1faf27f9cf0255e5a647f141e3451153704351a8c28aa6714"} Feb 18 12:17:26 crc kubenswrapper[4922]: I0218 12:17:26.391730 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerStarted","Data":"01c732102ca2fd46bdb631cd0a64430ba5aa0394d12e4a6da819849d4a2e5c11"} Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.404062 4922 generic.go:334] "Generic (PLEG): container finished" podID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerID="0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8" exitCode=0 Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.404188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerDied","Data":"0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8"} Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.412334 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.828897 4922 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.956089 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-inventory\") pod \"7d136111-09bf-46fe-aaf8-868a27741f9b\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.956710 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-secret-0\") pod \"7d136111-09bf-46fe-aaf8-868a27741f9b\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.956810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-ssh-key-openstack-edpm-ipam\") pod \"7d136111-09bf-46fe-aaf8-868a27741f9b\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.956911 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-combined-ca-bundle\") pod \"7d136111-09bf-46fe-aaf8-868a27741f9b\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.957114 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6h8x\" (UniqueName: \"kubernetes.io/projected/7d136111-09bf-46fe-aaf8-868a27741f9b-kube-api-access-c6h8x\") pod \"7d136111-09bf-46fe-aaf8-868a27741f9b\" (UID: \"7d136111-09bf-46fe-aaf8-868a27741f9b\") " Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.963798 
4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7d136111-09bf-46fe-aaf8-868a27741f9b" (UID: "7d136111-09bf-46fe-aaf8-868a27741f9b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.964478 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d136111-09bf-46fe-aaf8-868a27741f9b-kube-api-access-c6h8x" (OuterVolumeSpecName: "kube-api-access-c6h8x") pod "7d136111-09bf-46fe-aaf8-868a27741f9b" (UID: "7d136111-09bf-46fe-aaf8-868a27741f9b"). InnerVolumeSpecName "kube-api-access-c6h8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.992853 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-inventory" (OuterVolumeSpecName: "inventory") pod "7d136111-09bf-46fe-aaf8-868a27741f9b" (UID: "7d136111-09bf-46fe-aaf8-868a27741f9b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:17:27 crc kubenswrapper[4922]: I0218 12:17:27.994922 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d136111-09bf-46fe-aaf8-868a27741f9b" (UID: "7d136111-09bf-46fe-aaf8-868a27741f9b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.006882 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7d136111-09bf-46fe-aaf8-868a27741f9b" (UID: "7d136111-09bf-46fe-aaf8-868a27741f9b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.061294 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.061353 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.061395 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6h8x\" (UniqueName: \"kubernetes.io/projected/7d136111-09bf-46fe-aaf8-868a27741f9b-kube-api-access-c6h8x\") on node \"crc\" DevicePath \"\"" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.061413 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.061426 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7d136111-09bf-46fe-aaf8-868a27741f9b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.422859 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" event={"ID":"7d136111-09bf-46fe-aaf8-868a27741f9b","Type":"ContainerDied","Data":"d334758ef7c460ce4bd03b306cede9ac406be6bf21f14ff74b552e5a162c62ec"} Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.422911 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d334758ef7c460ce4bd03b306cede9ac406be6bf21f14ff74b552e5a162c62ec" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.422948 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.506774 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7"] Feb 18 12:17:28 crc kubenswrapper[4922]: E0218 12:17:28.507337 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d136111-09bf-46fe-aaf8-868a27741f9b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.507410 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d136111-09bf-46fe-aaf8-868a27741f9b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.507698 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d136111-09bf-46fe-aaf8-868a27741f9b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.508622 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.517013 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.517435 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.517561 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.517699 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.517830 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.518012 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.518137 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.528279 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7"] Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.573798 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 
12:17:28.573895 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.573929 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574313 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574509 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9rprl\" (UniqueName: \"kubernetes.io/projected/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-kube-api-access-9rprl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574853 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.574926 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.575038 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.676828 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.676930 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.676973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rprl\" (UniqueName: \"kubernetes.io/projected/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-kube-api-access-9rprl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 
12:17:28.677019 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677078 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677105 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677144 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677191 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677287 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.677323 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.678165 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.682869 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.682880 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.683123 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.684529 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.684674 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.684906 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.685493 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.686591 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.688101 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.702291 4922 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-9rprl\" (UniqueName: \"kubernetes.io/projected/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-kube-api-access-9rprl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hswp7\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:28 crc kubenswrapper[4922]: I0218 12:17:28.831519 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" Feb 18 12:17:29 crc kubenswrapper[4922]: I0218 12:17:29.375512 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7"] Feb 18 12:17:29 crc kubenswrapper[4922]: I0218 12:17:29.432569 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" event={"ID":"6e9e482a-c85e-473f-b848-e6fb6ba6afcd","Type":"ContainerStarted","Data":"b802ddd617fe0e8d5f19f836561f6f6d8d1e7e23231f318980afcb28d7a13059"} Feb 18 12:17:31 crc kubenswrapper[4922]: I0218 12:17:31.507763 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" event={"ID":"6e9e482a-c85e-473f-b848-e6fb6ba6afcd","Type":"ContainerStarted","Data":"2adb9429bcb3ef41bd775beaaf3c6d68c44fa008e3e767ad5a7534d6bcd41e78"} Feb 18 12:17:32 crc kubenswrapper[4922]: I0218 12:17:32.519604 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerStarted","Data":"a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee"} Feb 18 12:17:32 crc kubenswrapper[4922]: I0218 12:17:32.546872 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" podStartSLOduration=3.734466121 podStartE2EDuration="4.546843325s" podCreationTimestamp="2026-02-18 12:17:28 +0000 UTC" 
firstStartedPulling="2026-02-18 12:17:29.387037443 +0000 UTC m=+2451.114741523" lastFinishedPulling="2026-02-18 12:17:30.199414647 +0000 UTC m=+2451.927118727" observedRunningTime="2026-02-18 12:17:31.536495091 +0000 UTC m=+2453.264199171" watchObservedRunningTime="2026-02-18 12:17:32.546843325 +0000 UTC m=+2454.274547435" Feb 18 12:17:34 crc kubenswrapper[4922]: I0218 12:17:34.543714 4922 generic.go:334] "Generic (PLEG): container finished" podID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerID="a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee" exitCode=0 Feb 18 12:17:34 crc kubenswrapper[4922]: I0218 12:17:34.543767 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerDied","Data":"a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee"} Feb 18 12:17:35 crc kubenswrapper[4922]: I0218 12:17:35.555776 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerStarted","Data":"dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f"} Feb 18 12:17:35 crc kubenswrapper[4922]: I0218 12:17:35.575243 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vqh2" podStartSLOduration=3.016852244 podStartE2EDuration="10.575222238s" podCreationTimestamp="2026-02-18 12:17:25 +0000 UTC" firstStartedPulling="2026-02-18 12:17:27.40963973 +0000 UTC m=+2449.137343810" lastFinishedPulling="2026-02-18 12:17:34.968009714 +0000 UTC m=+2456.695713804" observedRunningTime="2026-02-18 12:17:35.57369669 +0000 UTC m=+2457.301400790" watchObservedRunningTime="2026-02-18 12:17:35.575222238 +0000 UTC m=+2457.302926318" Feb 18 12:17:35 crc kubenswrapper[4922]: I0218 12:17:35.655257 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:35 crc kubenswrapper[4922]: I0218 12:17:35.655342 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:35 crc kubenswrapper[4922]: I0218 12:17:35.972822 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:17:35 crc kubenswrapper[4922]: E0218 12:17:35.973501 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:17:36 crc kubenswrapper[4922]: I0218 12:17:36.709799 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8vqh2" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="registry-server" probeResult="failure" output=< Feb 18 12:17:36 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 12:17:36 crc kubenswrapper[4922]: > Feb 18 12:17:45 crc kubenswrapper[4922]: I0218 12:17:45.708040 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:45 crc kubenswrapper[4922]: I0218 12:17:45.762466 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vqh2" Feb 18 12:17:45 crc kubenswrapper[4922]: I0218 12:17:45.949311 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqh2"] Feb 18 12:17:47 crc kubenswrapper[4922]: I0218 12:17:47.657068 4922 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-8vqh2" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="registry-server" containerID="cri-o://dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f" gracePeriod=2
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.170514 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqh2"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.335994 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-utilities\") pod \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") "
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.336209 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkr4g\" (UniqueName: \"kubernetes.io/projected/526949a6-53f6-4b36-b4ec-48a4a8b612e9-kube-api-access-dkr4g\") pod \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") "
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.336280 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-catalog-content\") pod \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\" (UID: \"526949a6-53f6-4b36-b4ec-48a4a8b612e9\") "
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.336787 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-utilities" (OuterVolumeSpecName: "utilities") pod "526949a6-53f6-4b36-b4ec-48a4a8b612e9" (UID: "526949a6-53f6-4b36-b4ec-48a4a8b612e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.337231 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.343691 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526949a6-53f6-4b36-b4ec-48a4a8b612e9-kube-api-access-dkr4g" (OuterVolumeSpecName: "kube-api-access-dkr4g") pod "526949a6-53f6-4b36-b4ec-48a4a8b612e9" (UID: "526949a6-53f6-4b36-b4ec-48a4a8b612e9"). InnerVolumeSpecName "kube-api-access-dkr4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.392900 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "526949a6-53f6-4b36-b4ec-48a4a8b612e9" (UID: "526949a6-53f6-4b36-b4ec-48a4a8b612e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.438621 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkr4g\" (UniqueName: \"kubernetes.io/projected/526949a6-53f6-4b36-b4ec-48a4a8b612e9-kube-api-access-dkr4g\") on node \"crc\" DevicePath \"\""
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.438841 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526949a6-53f6-4b36-b4ec-48a4a8b612e9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.670174 4922 generic.go:334] "Generic (PLEG): container finished" podID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerID="dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f" exitCode=0
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.670240 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerDied","Data":"dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f"}
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.670271 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqh2"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.670623 4922 scope.go:117] "RemoveContainer" containerID="dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.670584 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqh2" event={"ID":"526949a6-53f6-4b36-b4ec-48a4a8b612e9","Type":"ContainerDied","Data":"01c732102ca2fd46bdb631cd0a64430ba5aa0394d12e4a6da819849d4a2e5c11"}
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.699923 4922 scope.go:117] "RemoveContainer" containerID="a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.719394 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqh2"]
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.728162 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vqh2"]
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.741222 4922 scope.go:117] "RemoveContainer" containerID="0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.805560 4922 scope.go:117] "RemoveContainer" containerID="dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f"
Feb 18 12:17:48 crc kubenswrapper[4922]: E0218 12:17:48.809475 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f\": container with ID starting with dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f not found: ID does not exist" containerID="dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.809513 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f"} err="failed to get container status \"dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f\": rpc error: code = NotFound desc = could not find container \"dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f\": container with ID starting with dc982e8e0ad2018b62dcf843753a3126bfdb6d8b3e07965135a9f90ed93bc56f not found: ID does not exist"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.809536 4922 scope.go:117] "RemoveContainer" containerID="a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee"
Feb 18 12:17:48 crc kubenswrapper[4922]: E0218 12:17:48.817496 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee\": container with ID starting with a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee not found: ID does not exist" containerID="a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.817531 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee"} err="failed to get container status \"a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee\": rpc error: code = NotFound desc = could not find container \"a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee\": container with ID starting with a92dee8aeb759eec962acd64c4a337ee288c07c686cfa1ad8f7c4d73cb7393ee not found: ID does not exist"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.817557 4922 scope.go:117] "RemoveContainer" containerID="0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8"
Feb 18 12:17:48 crc kubenswrapper[4922]: E0218 12:17:48.824497 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8\": container with ID starting with 0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8 not found: ID does not exist" containerID="0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.824544 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8"} err="failed to get container status \"0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8\": rpc error: code = NotFound desc = could not find container \"0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8\": container with ID starting with 0a1c454e5a730d36b5ea73c9d1cecd08af9d5ee5f5c304af6fb52d0ccce6cbb8 not found: ID does not exist"
Feb 18 12:17:48 crc kubenswrapper[4922]: I0218 12:17:48.985091 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" path="/var/lib/kubelet/pods/526949a6-53f6-4b36-b4ec-48a4a8b612e9/volumes"
Feb 18 12:17:49 crc kubenswrapper[4922]: I0218 12:17:49.973148 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:17:49 crc kubenswrapper[4922]: E0218 12:17:49.973531 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:18:03 crc kubenswrapper[4922]: I0218 12:18:03.973536 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:18:03 crc kubenswrapper[4922]: E0218 12:18:03.974163 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:18:15 crc kubenswrapper[4922]: I0218 12:18:15.973344 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:18:15 crc kubenswrapper[4922]: E0218 12:18:15.974259 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:18:29 crc kubenswrapper[4922]: I0218 12:18:29.004489 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:18:29 crc kubenswrapper[4922]: E0218 12:18:29.006286 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:18:42 crc kubenswrapper[4922]: I0218 12:18:42.975022 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:18:42 crc kubenswrapper[4922]: E0218 12:18:42.975931 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:18:53 crc kubenswrapper[4922]: I0218 12:18:53.974297 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:18:53 crc kubenswrapper[4922]: E0218 12:18:53.975524 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:19:05 crc kubenswrapper[4922]: I0218 12:19:05.973540 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:19:05 crc kubenswrapper[4922]: E0218 12:19:05.974353 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:19:17 crc kubenswrapper[4922]: I0218 12:19:17.973169 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:19:17 crc kubenswrapper[4922]: E0218 12:19:17.973953 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:19:31 crc kubenswrapper[4922]: I0218 12:19:31.973978 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:19:31 crc kubenswrapper[4922]: E0218 12:19:31.974864 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:19:42 crc kubenswrapper[4922]: I0218 12:19:42.973298 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:19:42 crc kubenswrapper[4922]: E0218 12:19:42.974123 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:19:51 crc kubenswrapper[4922]: I0218 12:19:51.783887 4922 generic.go:334] "Generic (PLEG): container finished" podID="6e9e482a-c85e-473f-b848-e6fb6ba6afcd" containerID="2adb9429bcb3ef41bd775beaaf3c6d68c44fa008e3e767ad5a7534d6bcd41e78" exitCode=0
Feb 18 12:19:51 crc kubenswrapper[4922]: I0218 12:19:51.783976 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" event={"ID":"6e9e482a-c85e-473f-b848-e6fb6ba6afcd","Type":"ContainerDied","Data":"2adb9429bcb3ef41bd775beaaf3c6d68c44fa008e3e767ad5a7534d6bcd41e78"}
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.234916 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.354662 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-extra-config-0\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.354718 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-1\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.354755 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-1\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.354799 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-2\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.354870 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-0\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.354941 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-ssh-key-openstack-edpm-ipam\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.355007 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-0\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.355034 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-combined-ca-bundle\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.355099 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rprl\" (UniqueName: \"kubernetes.io/projected/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-kube-api-access-9rprl\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.355128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-3\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.355162 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-inventory\") pod \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\" (UID: \"6e9e482a-c85e-473f-b848-e6fb6ba6afcd\") "
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.362688 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.366589 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-kube-api-access-9rprl" (OuterVolumeSpecName: "kube-api-access-9rprl") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "kube-api-access-9rprl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.392220 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.396261 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.397776 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.408703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.411087 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.413725 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.414445 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-inventory" (OuterVolumeSpecName: "inventory") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.417180 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.420575 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6e9e482a-c85e-473f-b848-e6fb6ba6afcd" (UID: "6e9e482a-c85e-473f-b848-e6fb6ba6afcd"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457299 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457338 4922 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457354 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457386 4922 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457398 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457410 4922 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457421 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457432 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457441 4922 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457452 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rprl\" (UniqueName: \"kubernetes.io/projected/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-kube-api-access-9rprl\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.457462 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6e9e482a-c85e-473f-b848-e6fb6ba6afcd-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.810788 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7" event={"ID":"6e9e482a-c85e-473f-b848-e6fb6ba6afcd","Type":"ContainerDied","Data":"b802ddd617fe0e8d5f19f836561f6f6d8d1e7e23231f318980afcb28d7a13059"}
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.810845 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b802ddd617fe0e8d5f19f836561f6f6d8d1e7e23231f318980afcb28d7a13059"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.810903 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hswp7"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.929144 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"]
Feb 18 12:19:53 crc kubenswrapper[4922]: E0218 12:19:53.929712 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="extract-utilities"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.929737 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="extract-utilities"
Feb 18 12:19:53 crc kubenswrapper[4922]: E0218 12:19:53.929780 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9e482a-c85e-473f-b848-e6fb6ba6afcd" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.929789 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9e482a-c85e-473f-b848-e6fb6ba6afcd" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:19:53 crc kubenswrapper[4922]: E0218 12:19:53.929808 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="extract-content"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.929818 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="extract-content"
Feb 18 12:19:53 crc kubenswrapper[4922]: E0218 12:19:53.929829 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="registry-server"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.929837 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="registry-server"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.930050 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="526949a6-53f6-4b36-b4ec-48a4a8b612e9" containerName="registry-server"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.930077 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9e482a-c85e-473f-b848-e6fb6ba6afcd" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.931002 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.934153 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.934349 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.934536 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.934647 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8fsfv"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.934765 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965423 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965540 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965564 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qxrx\" (UniqueName: \"kubernetes.io/projected/0c5871a2-bb79-4b43-a830-7714fa7d8241-kube-api-access-2qxrx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965586 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965642 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965669 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.965722 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.966946 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"]
Feb 18 12:19:53 crc kubenswrapper[4922]: I0218 12:19:53.974229 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:19:53 crc kubenswrapper[4922]: E0218 12:19:53.974474 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.067609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.067801 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.067946 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.067975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qxrx\" (UniqueName: \"kubernetes.io/projected/0c5871a2-bb79-4b43-a830-7714fa7d8241-kube-api-access-2qxrx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"
Feb 18 12:19:54 crc kubenswrapper[4922]:
I0218 12:19:54.068007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.068128 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.068166 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.077702 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.084008 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.085064 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.088920 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.095032 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.103309 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.106307 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qxrx\" (UniqueName: \"kubernetes.io/projected/0c5871a2-bb79-4b43-a830-7714fa7d8241-kube-api-access-2qxrx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-57sjs\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.246786 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.738284 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs"] Feb 18 12:19:54 crc kubenswrapper[4922]: I0218 12:19:54.823317 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" event={"ID":"0c5871a2-bb79-4b43-a830-7714fa7d8241","Type":"ContainerStarted","Data":"0f088979f73251d390e5b8a547c861aea9ecc20d03724e3ee0291b2f23342cde"} Feb 18 12:19:55 crc kubenswrapper[4922]: I0218 12:19:55.834299 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" event={"ID":"0c5871a2-bb79-4b43-a830-7714fa7d8241","Type":"ContainerStarted","Data":"2b647db5b11649df8cf1b62b53fef9fdce9f9caa2ea6827a8ff6a1eafa1b40eb"} Feb 18 12:19:55 crc kubenswrapper[4922]: I0218 12:19:55.864195 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" podStartSLOduration=2.0672184319999998 podStartE2EDuration="2.864156741s" podCreationTimestamp="2026-02-18 12:19:53 +0000 UTC" firstStartedPulling="2026-02-18 12:19:54.743930003 +0000 UTC m=+2596.471634073" 
lastFinishedPulling="2026-02-18 12:19:55.540868312 +0000 UTC m=+2597.268572382" observedRunningTime="2026-02-18 12:19:55.851043859 +0000 UTC m=+2597.578747949" watchObservedRunningTime="2026-02-18 12:19:55.864156741 +0000 UTC m=+2597.591860821" Feb 18 12:20:07 crc kubenswrapper[4922]: I0218 12:20:07.974156 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:20:07 crc kubenswrapper[4922]: E0218 12:20:07.975604 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:20:19 crc kubenswrapper[4922]: I0218 12:20:19.973124 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:20:19 crc kubenswrapper[4922]: E0218 12:20:19.973824 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:20:34 crc kubenswrapper[4922]: I0218 12:20:34.973585 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:20:34 crc kubenswrapper[4922]: E0218 12:20:34.974278 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:20:45 crc kubenswrapper[4922]: I0218 12:20:45.973634 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:20:45 crc kubenswrapper[4922]: E0218 12:20:45.974463 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:20:58 crc kubenswrapper[4922]: I0218 12:20:58.981413 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:20:58 crc kubenswrapper[4922]: E0218 12:20:58.982190 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:21:10 crc kubenswrapper[4922]: I0218 12:21:10.973727 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:21:10 crc kubenswrapper[4922]: E0218 12:21:10.974677 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:21:23 crc kubenswrapper[4922]: I0218 12:21:23.973794 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:21:23 crc kubenswrapper[4922]: E0218 12:21:23.974626 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:21:35 crc kubenswrapper[4922]: I0218 12:21:35.975120 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:21:35 crc kubenswrapper[4922]: E0218 12:21:35.976098 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:21:43 crc kubenswrapper[4922]: I0218 12:21:43.755543 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c5871a2-bb79-4b43-a830-7714fa7d8241" containerID="2b647db5b11649df8cf1b62b53fef9fdce9f9caa2ea6827a8ff6a1eafa1b40eb" exitCode=0 Feb 18 12:21:43 crc kubenswrapper[4922]: I0218 12:21:43.755630 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" event={"ID":"0c5871a2-bb79-4b43-a830-7714fa7d8241","Type":"ContainerDied","Data":"2b647db5b11649df8cf1b62b53fef9fdce9f9caa2ea6827a8ff6a1eafa1b40eb"} Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.241164 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.287494 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-inventory\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.287842 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qxrx\" (UniqueName: \"kubernetes.io/projected/0c5871a2-bb79-4b43-a830-7714fa7d8241-kube-api-access-2qxrx\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.287873 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-0\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.287895 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-telemetry-combined-ca-bundle\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.287913 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-2\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.288025 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ssh-key-openstack-edpm-ipam\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.288057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-1\") pod \"0c5871a2-bb79-4b43-a830-7714fa7d8241\" (UID: \"0c5871a2-bb79-4b43-a830-7714fa7d8241\") " Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.293576 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.311308 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c5871a2-bb79-4b43-a830-7714fa7d8241-kube-api-access-2qxrx" (OuterVolumeSpecName: "kube-api-access-2qxrx") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "kube-api-access-2qxrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.316020 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.317182 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-inventory" (OuterVolumeSpecName: "inventory") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.318809 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.321382 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.323870 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0c5871a2-bb79-4b43-a830-7714fa7d8241" (UID: "0c5871a2-bb79-4b43-a830-7714fa7d8241"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.393056 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.393197 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qxrx\" (UniqueName: \"kubernetes.io/projected/0c5871a2-bb79-4b43-a830-7714fa7d8241-kube-api-access-2qxrx\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.393285 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.393354 4922 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.393445 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc 
kubenswrapper[4922]: I0218 12:21:45.393507 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.393570 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0c5871a2-bb79-4b43-a830-7714fa7d8241-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.778187 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" event={"ID":"0c5871a2-bb79-4b43-a830-7714fa7d8241","Type":"ContainerDied","Data":"0f088979f73251d390e5b8a547c861aea9ecc20d03724e3ee0291b2f23342cde"} Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.778227 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f088979f73251d390e5b8a547c861aea9ecc20d03724e3ee0291b2f23342cde" Feb 18 12:21:45 crc kubenswrapper[4922]: I0218 12:21:45.778282 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-57sjs" Feb 18 12:21:46 crc kubenswrapper[4922]: I0218 12:21:46.975596 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab" Feb 18 12:21:47 crc kubenswrapper[4922]: I0218 12:21:47.810471 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"ef51f81fc06c6367e33199b91ef56039257473dad48437b29bc57a1983bf05c7"} Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.214771 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6t7b"] Feb 18 12:22:22 crc kubenswrapper[4922]: E0218 12:22:22.217626 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5871a2-bb79-4b43-a830-7714fa7d8241" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.217651 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5871a2-bb79-4b43-a830-7714fa7d8241" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.217834 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c5871a2-bb79-4b43-a830-7714fa7d8241" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.219283 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.235202 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6t7b"] Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.241196 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-utilities\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.241298 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-catalog-content\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.241437 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw68x\" (UniqueName: \"kubernetes.io/projected/6217a40d-f959-4afa-b48e-b25c1c1693c1-kube-api-access-rw68x\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.342766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-utilities\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.342837 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-catalog-content\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.342870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw68x\" (UniqueName: \"kubernetes.io/projected/6217a40d-f959-4afa-b48e-b25c1c1693c1-kube-api-access-rw68x\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.343298 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-utilities\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.343437 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-catalog-content\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.365003 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw68x\" (UniqueName: \"kubernetes.io/projected/6217a40d-f959-4afa-b48e-b25c1c1693c1-kube-api-access-rw68x\") pod \"redhat-operators-x6t7b\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:22 crc kubenswrapper[4922]: I0218 12:22:22.583684 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:23 crc kubenswrapper[4922]: I0218 12:22:23.083920 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6t7b"] Feb 18 12:22:23 crc kubenswrapper[4922]: I0218 12:22:23.105172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerStarted","Data":"01304453471ec40644ea6059952df8e463fbece81357fde7a770b3318d0244bd"} Feb 18 12:22:24 crc kubenswrapper[4922]: I0218 12:22:24.115070 4922 generic.go:334] "Generic (PLEG): container finished" podID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerID="13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296" exitCode=0 Feb 18 12:22:24 crc kubenswrapper[4922]: I0218 12:22:24.115124 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerDied","Data":"13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296"} Feb 18 12:22:25 crc kubenswrapper[4922]: I0218 12:22:25.125895 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerStarted","Data":"cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227"} Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.594036 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gmwj8"] Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.596272 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.605585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmwj8"] Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.661125 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4fm\" (UniqueName: \"kubernetes.io/projected/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-kube-api-access-nz4fm\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.661171 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-catalog-content\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.661281 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-utilities\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.763417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4fm\" (UniqueName: \"kubernetes.io/projected/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-kube-api-access-nz4fm\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.763477 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-catalog-content\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.763579 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-utilities\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.764170 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-utilities\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.764203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-catalog-content\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.783260 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4fm\" (UniqueName: \"kubernetes.io/projected/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-kube-api-access-nz4fm\") pod \"redhat-marketplace-gmwj8\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") " pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:27 crc kubenswrapper[4922]: I0218 12:22:27.915202 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:28 crc kubenswrapper[4922]: I0218 12:22:28.440619 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmwj8"] Feb 18 12:22:29 crc kubenswrapper[4922]: E0218 12:22:29.051568 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6217a40d_f959_4afa_b48e_b25c1c1693c1.slice/crio-conmon-cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227.scope\": RecentStats: unable to find data in memory cache]" Feb 18 12:22:29 crc kubenswrapper[4922]: I0218 12:22:29.166463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerStarted","Data":"5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45"} Feb 18 12:22:29 crc kubenswrapper[4922]: I0218 12:22:29.166517 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerStarted","Data":"c172eba0a6e88b586259ce4d77fcba9de3974f0deeaae0baf659acb7a41aaa60"} Feb 18 12:22:29 crc kubenswrapper[4922]: I0218 12:22:29.169139 4922 generic.go:334] "Generic (PLEG): container finished" podID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerID="cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227" exitCode=0 Feb 18 12:22:29 crc kubenswrapper[4922]: I0218 12:22:29.169195 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerDied","Data":"cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227"} Feb 18 12:22:29 crc kubenswrapper[4922]: I0218 12:22:29.184404 4922 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:22:30 crc kubenswrapper[4922]: I0218 12:22:30.181443 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerID="5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45" exitCode=0 Feb 18 12:22:30 crc kubenswrapper[4922]: I0218 12:22:30.181670 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerDied","Data":"5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45"} Feb 18 12:22:31 crc kubenswrapper[4922]: I0218 12:22:31.191972 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerStarted","Data":"e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723"} Feb 18 12:22:31 crc kubenswrapper[4922]: I0218 12:22:31.196603 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerStarted","Data":"bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae"} Feb 18 12:22:31 crc kubenswrapper[4922]: I0218 12:22:31.237585 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6t7b" podStartSLOduration=3.108707252 podStartE2EDuration="9.23756636s" podCreationTimestamp="2026-02-18 12:22:22 +0000 UTC" firstStartedPulling="2026-02-18 12:22:24.1189741 +0000 UTC m=+2745.846678180" lastFinishedPulling="2026-02-18 12:22:30.247833208 +0000 UTC m=+2751.975537288" observedRunningTime="2026-02-18 12:22:31.22689851 +0000 UTC m=+2752.954602600" watchObservedRunningTime="2026-02-18 12:22:31.23756636 +0000 UTC m=+2752.965270440" Feb 18 12:22:32 crc kubenswrapper[4922]: I0218 12:22:32.208421 4922 generic.go:334] "Generic 
(PLEG): container finished" podID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerID="e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723" exitCode=0 Feb 18 12:22:32 crc kubenswrapper[4922]: I0218 12:22:32.208513 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerDied","Data":"e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723"} Feb 18 12:22:32 crc kubenswrapper[4922]: I0218 12:22:32.584353 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:32 crc kubenswrapper[4922]: I0218 12:22:32.584667 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:33 crc kubenswrapper[4922]: I0218 12:22:33.221223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerStarted","Data":"824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a"} Feb 18 12:22:33 crc kubenswrapper[4922]: I0218 12:22:33.239986 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gmwj8" podStartSLOduration=3.736879006 podStartE2EDuration="6.239956925s" podCreationTimestamp="2026-02-18 12:22:27 +0000 UTC" firstStartedPulling="2026-02-18 12:22:30.183719923 +0000 UTC m=+2751.911424003" lastFinishedPulling="2026-02-18 12:22:32.686797842 +0000 UTC m=+2754.414501922" observedRunningTime="2026-02-18 12:22:33.236812115 +0000 UTC m=+2754.964516195" watchObservedRunningTime="2026-02-18 12:22:33.239956925 +0000 UTC m=+2754.967660995" Feb 18 12:22:33 crc kubenswrapper[4922]: I0218 12:22:33.630434 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x6t7b" 
podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="registry-server" probeResult="failure" output=< Feb 18 12:22:33 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 12:22:33 crc kubenswrapper[4922]: > Feb 18 12:22:37 crc kubenswrapper[4922]: I0218 12:22:37.915796 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:37 crc kubenswrapper[4922]: I0218 12:22:37.917631 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:37 crc kubenswrapper[4922]: I0218 12:22:37.971190 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:38 crc kubenswrapper[4922]: I0218 12:22:38.318649 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gmwj8" Feb 18 12:22:38 crc kubenswrapper[4922]: I0218 12:22:38.375118 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmwj8"] Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.124440 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.124792 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="prometheus" containerID="cri-o://6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b" gracePeriod=600 Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.124922 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="config-reloader" 
containerID="cri-o://491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c" gracePeriod=600 Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.124906 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="thanos-sidecar" containerID="cri-o://a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b" gracePeriod=600 Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.291562 4922 generic.go:334] "Generic (PLEG): container finished" podID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerID="a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b" exitCode=0 Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.291851 4922 generic.go:334] "Generic (PLEG): container finished" podID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerID="6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b" exitCode=0 Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.292671 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerDied","Data":"a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b"} Feb 18 12:22:39 crc kubenswrapper[4922]: I0218 12:22:39.292711 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerDied","Data":"6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b"} Feb 18 12:22:39 crc kubenswrapper[4922]: E0218 12:22:39.328344 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791eb2d2_e8b0_401d_8b8b_b79d675c1ca4.slice/crio-6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b.scope\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791eb2d2_e8b0_401d_8b8b_b79d675c1ca4.slice/crio-conmon-a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791eb2d2_e8b0_401d_8b8b_b79d675c1ca4.slice/crio-a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b.scope\": RecentStats: unable to find data in memory cache]" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.140212 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.303434 4922 generic.go:334] "Generic (PLEG): container finished" podID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerID="491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c" exitCode=0 Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.303533 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.303523 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerDied","Data":"491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c"} Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.303615 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4","Type":"ContainerDied","Data":"eb650b51fbe53331680115b5b916f11fa8f568a0682dba6d2801da306b88e53d"} Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.303655 4922 scope.go:117] "RemoveContainer" containerID="a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.303654 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gmwj8" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="registry-server" containerID="cri-o://824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a" gracePeriod=2 Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.329179 4922 scope.go:117] "RemoveContainer" containerID="491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-2\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335108 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-1\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335130 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335210 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp5t7\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-kube-api-access-vp5t7\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335246 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-secret-combined-ca-bundle\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335273 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-thanos-prometheus-http-client-file\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335322 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-0\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335343 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335403 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335426 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-tls-assets\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335447 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config-out\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335910 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.335961 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config\") pod \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\" (UID: \"791eb2d2-e8b0-401d-8b8b-b79d675c1ca4\") " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.337020 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.337116 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.337484 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.341640 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.343156 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config" (OuterVolumeSpecName: "config") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.343327 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.343458 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.344084 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.345973 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config-out" (OuterVolumeSpecName: "config-out") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.347702 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-kube-api-access-vp5t7" (OuterVolumeSpecName: "kube-api-access-vp5t7") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "kube-api-access-vp5t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.349673 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.359331 4922 scope.go:117] "RemoveContainer" containerID="6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.370810 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "pvc-15ee050c-37dd-43df-8cbe-1200f09a5545". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438295 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") on node \"crc\" " Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438337 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438353 4922 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438462 4922 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438489 4922 
reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438505 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp5t7\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-kube-api-access-vp5t7\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438518 4922 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438536 4922 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438551 4922 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438565 4922 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438579 4922 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-tls-assets\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.438762 4922 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-config-out\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.462856 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config" (OuterVolumeSpecName: "web-config") pod "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" (UID: "791eb2d2-e8b0-401d-8b8b-b79d675c1ca4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.511154 4922 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.512158 4922 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-15ee050c-37dd-43df-8cbe-1200f09a5545" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545") on node "crc"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.540514 4922 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4-web-config\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.540549 4922 reconciler_common.go:293] "Volume detached for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.613652 4922 scope.go:117] "RemoveContainer" containerID="66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.663579 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.668829 4922 scope.go:117] "RemoveContainer" containerID="a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.679201 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.692798 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b\": container with ID starting with a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b not found: ID does not exist" containerID="a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.692858 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b"} err="failed to get container status \"a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b\": rpc error: code = NotFound desc = could not find container \"a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b\": container with ID starting with a913d6036fa961f4da9057dcc5a6d40a6e4b3e35d77e43f7033ef4f740b7546b not found: ID does not exist"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.692890 4922 scope.go:117] "RemoveContainer" containerID="491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c"
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.693403 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c\": container with ID starting with 491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c not found: ID does not exist" containerID="491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.693448 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c"} err="failed to get container status \"491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c\": rpc error: code = NotFound desc = could not find container \"491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c\": container with ID starting with 491ccef72e7d04d2ba0ad4f552d88e3b6a7d14b1901010114c09e2c9f1ceb17c not found: ID does not exist"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.693476 4922 scope.go:117] "RemoveContainer" containerID="6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b"
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.693810 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b\": container with ID starting with 6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b not found: ID does not exist" containerID="6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.693847 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b"} err="failed to get container status \"6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b\": rpc error: code = NotFound desc = could not find container \"6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b\": container with ID starting with 6a87e12a9d544ccc469c240f591cfdfbc76c565f80e42da4d9b856e9a979562b not found: ID does not exist"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.693866 4922 scope.go:117] "RemoveContainer" containerID="66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8"
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.697672 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8\": container with ID starting with 66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8 not found: ID does not exist" containerID="66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.697773 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8"} err="failed to get container status \"66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8\": rpc error: code = NotFound desc = could not find container \"66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8\": container with ID starting with 66d83f5860f3877f372e8e31496543d7009d3da9bb6e8967b9cf02ffa25c24f8 not found: ID does not exist"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.708716 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.709193 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="prometheus"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709218 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="prometheus"
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.709242 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="config-reloader"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709251 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="config-reloader"
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.709270 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="init-config-reloader"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709280 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="init-config-reloader"
Feb 18 12:22:40 crc kubenswrapper[4922]: E0218 12:22:40.709297 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="thanos-sidecar"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709304 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="thanos-sidecar"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709658 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="config-reloader"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709688 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="thanos-sidecar"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.709700 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" containerName="prometheus"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.711873 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.714055 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xmthr"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.715301 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.715750 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.716053 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.716294 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.716473 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.716737 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.726727 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.746598 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.777745 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmwj8"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.857410 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.857811 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-config\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.857927 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.857968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/13358646-85fa-4761-b4e8-ce5baf8851da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858003 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/13358646-85fa-4761-b4e8-ce5baf8851da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858042 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858168 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858323 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858387 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858540 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgk7d\" (UniqueName: \"kubernetes.io/projected/13358646-85fa-4761-b4e8-ce5baf8851da-kube-api-access-sgk7d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.858573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960028 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-utilities\") pod \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") "
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960230 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4fm\" (UniqueName: \"kubernetes.io/projected/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-kube-api-access-nz4fm\") pod \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") "
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960399 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-catalog-content\") pod \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\" (UID: \"d2f2ec0a-16c2-4808-8871-a8e56bd045a9\") "
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgk7d\" (UniqueName: \"kubernetes.io/projected/13358646-85fa-4761-b4e8-ce5baf8851da-kube-api-access-sgk7d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960689 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960725 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960742 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-config\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960834 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-utilities" (OuterVolumeSpecName: "utilities") pod "d2f2ec0a-16c2-4808-8871-a8e56bd045a9" (UID: "d2f2ec0a-16c2-4808-8871-a8e56bd045a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.960929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.961469 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/13358646-85fa-4761-b4e8-ce5baf8851da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.961530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/13358646-85fa-4761-b4e8-ce5baf8851da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.961565 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.961707 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.962143 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.962205 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.962806 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.962880 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.962897 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.962946 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.963528 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.963647 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/13358646-85fa-4761-b4e8-ce5baf8851da-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.965824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-config\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.970296 4922 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.970346 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d461f0c4a551673a0d7d7003637451f1312f1b9722a2159a051859daee296e97/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.970454 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/13358646-85fa-4761-b4e8-ce5baf8851da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.970573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.971160 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-kube-api-access-nz4fm" (OuterVolumeSpecName: "kube-api-access-nz4fm") pod "d2f2ec0a-16c2-4808-8871-a8e56bd045a9" (UID: "d2f2ec0a-16c2-4808-8871-a8e56bd045a9"). InnerVolumeSpecName "kube-api-access-nz4fm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.974585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.976861 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.978623 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.980968 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/13358646-85fa-4761-b4e8-ce5baf8851da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.981915 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/13358646-85fa-4761-b4e8-ce5baf8851da-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.982562 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgk7d\" (UniqueName: \"kubernetes.io/projected/13358646-85fa-4761-b4e8-ce5baf8851da-kube-api-access-sgk7d\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.984086 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2f2ec0a-16c2-4808-8871-a8e56bd045a9" (UID: "d2f2ec0a-16c2-4808-8871-a8e56bd045a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:22:40 crc kubenswrapper[4922]: I0218 12:22:40.986938 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="791eb2d2-e8b0-401d-8b8b-b79d675c1ca4" path="/var/lib/kubelet/pods/791eb2d2-e8b0-401d-8b8b-b79d675c1ca4/volumes"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.029980 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ee050c-37dd-43df-8cbe-1200f09a5545\") pod \"prometheus-metric-storage-0\" (UID: \"13358646-85fa-4761-b4e8-ce5baf8851da\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.064889 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.064931 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz4fm\" (UniqueName: \"kubernetes.io/projected/d2f2ec0a-16c2-4808-8871-a8e56bd045a9-kube-api-access-nz4fm\") on node \"crc\" DevicePath \"\""
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.098494 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.331988 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerID="824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a" exitCode=0
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.332675 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmwj8"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.332632 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerDied","Data":"824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a"}
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.332755 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmwj8" event={"ID":"d2f2ec0a-16c2-4808-8871-a8e56bd045a9","Type":"ContainerDied","Data":"c172eba0a6e88b586259ce4d77fcba9de3974f0deeaae0baf659acb7a41aaa60"}
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.332820 4922 scope.go:117] "RemoveContainer" containerID="824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.369103 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmwj8"]
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.381684 4922 scope.go:117] "RemoveContainer" containerID="e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.383173 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmwj8"]
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.438770 4922 scope.go:117] "RemoveContainer" containerID="5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.491968 4922 scope.go:117] "RemoveContainer" containerID="824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a"
Feb 18 12:22:41 crc kubenswrapper[4922]: E0218 12:22:41.492439 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a\": container with ID starting with 824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a not found: ID does not exist" containerID="824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.492476 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a"} err="failed to get container status \"824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a\": rpc error: code = NotFound desc = could not find container \"824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a\": container with ID starting with 824db469829217c6af7f82c4435908cbbf5d6fa5600ef7ccdcf836213996956a not found: ID does not exist"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.492504 4922 scope.go:117] "RemoveContainer" containerID="e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723"
Feb 18 12:22:41 crc kubenswrapper[4922]: E0218 12:22:41.492987 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723\": container with ID starting with e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723 not found: ID does not exist" containerID="e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.493036 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723"} err="failed to get container status \"e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723\": rpc error: code = NotFound desc = could not find container \"e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723\": container with ID starting with e7a62b2634ba91dccc174bd713229fd50823f1b237565279bd2b4231ccc19723 not found: ID does not exist"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.493063 4922 scope.go:117] "RemoveContainer" containerID="5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45"
Feb 18 12:22:41 crc kubenswrapper[4922]: E0218 12:22:41.493348 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45\": container with ID starting with 5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45 not found: ID does not exist" containerID="5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.493415 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45"} err="failed to get container status \"5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45\": rpc error: code = NotFound desc = could not find container \"5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45\": container with ID starting with 5866c42fc40619d4c7f32758c298f20a5ee96d82a018e9ab977c8f97a23b5c45 not found: ID does not exist"
Feb 18 12:22:41 crc kubenswrapper[4922]: I0218 12:22:41.591405 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 12:22:42 crc kubenswrapper[4922]: I0218 12:22:42.345425 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"13358646-85fa-4761-b4e8-ce5baf8851da","Type":"ContainerStarted","Data":"ef9587009618985e51ce2fcc04d7b7619474cc27c677aa1e820ca8841fcfc34c"}
Feb 18 12:22:42 crc kubenswrapper[4922]: I0218 12:22:42.633597 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:42 crc kubenswrapper[4922]: I0218 12:22:42.688624 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:42 crc kubenswrapper[4922]: I0218 12:22:42.985960 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" path="/var/lib/kubelet/pods/d2f2ec0a-16c2-4808-8871-a8e56bd045a9/volumes" Feb 18 12:22:43 crc kubenswrapper[4922]: I0218 12:22:43.611657 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6t7b"] Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.364796 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6t7b" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="registry-server" containerID="cri-o://bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae" gracePeriod=2 Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.815274 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.950795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw68x\" (UniqueName: \"kubernetes.io/projected/6217a40d-f959-4afa-b48e-b25c1c1693c1-kube-api-access-rw68x\") pod \"6217a40d-f959-4afa-b48e-b25c1c1693c1\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.950861 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-catalog-content\") pod \"6217a40d-f959-4afa-b48e-b25c1c1693c1\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.951136 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-utilities\") pod \"6217a40d-f959-4afa-b48e-b25c1c1693c1\" (UID: \"6217a40d-f959-4afa-b48e-b25c1c1693c1\") " Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.951878 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-utilities" (OuterVolumeSpecName: "utilities") pod "6217a40d-f959-4afa-b48e-b25c1c1693c1" (UID: "6217a40d-f959-4afa-b48e-b25c1c1693c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:22:44 crc kubenswrapper[4922]: I0218 12:22:44.959707 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6217a40d-f959-4afa-b48e-b25c1c1693c1-kube-api-access-rw68x" (OuterVolumeSpecName: "kube-api-access-rw68x") pod "6217a40d-f959-4afa-b48e-b25c1c1693c1" (UID: "6217a40d-f959-4afa-b48e-b25c1c1693c1"). InnerVolumeSpecName "kube-api-access-rw68x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.054910 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.054957 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw68x\" (UniqueName: \"kubernetes.io/projected/6217a40d-f959-4afa-b48e-b25c1c1693c1-kube-api-access-rw68x\") on node \"crc\" DevicePath \"\"" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.125883 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6217a40d-f959-4afa-b48e-b25c1c1693c1" (UID: "6217a40d-f959-4afa-b48e-b25c1c1693c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.156422 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6217a40d-f959-4afa-b48e-b25c1c1693c1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.373961 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"13358646-85fa-4761-b4e8-ce5baf8851da","Type":"ContainerStarted","Data":"8727adbd08a3ca5bed3dd1d23301105b8cb55bae17fe0f81c41857080c79500d"} Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.377461 4922 generic.go:334] "Generic (PLEG): container finished" podID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerID="bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae" exitCode=0 Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.377518 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerDied","Data":"bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae"} Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.377544 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6t7b" event={"ID":"6217a40d-f959-4afa-b48e-b25c1c1693c1","Type":"ContainerDied","Data":"01304453471ec40644ea6059952df8e463fbece81357fde7a770b3318d0244bd"} Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.377563 4922 scope.go:117] "RemoveContainer" containerID="bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.377942 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6t7b" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.399343 4922 scope.go:117] "RemoveContainer" containerID="cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.438584 4922 scope.go:117] "RemoveContainer" containerID="13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.440271 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6t7b"] Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.457743 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6t7b"] Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.483714 4922 scope.go:117] "RemoveContainer" containerID="bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae" Feb 18 12:22:45 crc kubenswrapper[4922]: E0218 12:22:45.484581 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae\": container with ID starting with bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae not found: ID does not exist" containerID="bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.484628 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae"} err="failed to get container status \"bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae\": rpc error: code = NotFound desc = could not find container \"bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae\": container with ID starting with bbebd17e24478f9f7aaaca2db042a4da18b50d95dcd264d43cd200ad4b1d6eae not found: ID does not exist" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.484660 4922 scope.go:117] "RemoveContainer" containerID="cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227" Feb 18 12:22:45 crc kubenswrapper[4922]: E0218 12:22:45.485096 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227\": container with ID starting with cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227 not found: ID does not exist" containerID="cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.485150 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227"} err="failed to get container status \"cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227\": rpc error: code = NotFound desc = could not find container \"cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227\": container with ID 
starting with cbb7446ca5b63d5effa7c8e217cc14f792d76dd59052ea3d503ec58b1802b227 not found: ID does not exist" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.485186 4922 scope.go:117] "RemoveContainer" containerID="13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296" Feb 18 12:22:45 crc kubenswrapper[4922]: E0218 12:22:45.485495 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296\": container with ID starting with 13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296 not found: ID does not exist" containerID="13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296" Feb 18 12:22:45 crc kubenswrapper[4922]: I0218 12:22:45.485531 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296"} err="failed to get container status \"13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296\": rpc error: code = NotFound desc = could not find container \"13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296\": container with ID starting with 13a7f699d921da48e9cffe7b53f8c4c33c05acbf32d0e501384d7a7ecca47296 not found: ID does not exist" Feb 18 12:22:46 crc kubenswrapper[4922]: I0218 12:22:46.984929 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" path="/var/lib/kubelet/pods/6217a40d-f959-4afa-b48e-b25c1c1693c1/volumes" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.429891 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ktl86"] Feb 18 12:22:53 crc kubenswrapper[4922]: E0218 12:22:53.430925 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="extract-content" Feb 18 12:22:53 crc 
kubenswrapper[4922]: I0218 12:22:53.430947 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="extract-content" Feb 18 12:22:53 crc kubenswrapper[4922]: E0218 12:22:53.430973 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="registry-server" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.430981 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="registry-server" Feb 18 12:22:53 crc kubenswrapper[4922]: E0218 12:22:53.430989 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="registry-server" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.430997 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="registry-server" Feb 18 12:22:53 crc kubenswrapper[4922]: E0218 12:22:53.431021 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="extract-utilities" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.431029 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="extract-utilities" Feb 18 12:22:53 crc kubenswrapper[4922]: E0218 12:22:53.431054 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="extract-utilities" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.431062 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="extract-utilities" Feb 18 12:22:53 crc kubenswrapper[4922]: E0218 12:22:53.431072 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="extract-content" Feb 18 12:22:53 crc 
kubenswrapper[4922]: I0218 12:22:53.431080 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="extract-content" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.431296 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6217a40d-f959-4afa-b48e-b25c1c1693c1" containerName="registry-server" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.431333 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f2ec0a-16c2-4808-8871-a8e56bd045a9" containerName="registry-server" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.433051 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.451917 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ktl86"] Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.472184 4922 generic.go:334] "Generic (PLEG): container finished" podID="13358646-85fa-4761-b4e8-ce5baf8851da" containerID="8727adbd08a3ca5bed3dd1d23301105b8cb55bae17fe0f81c41857080c79500d" exitCode=0 Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.472239 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"13358646-85fa-4761-b4e8-ce5baf8851da","Type":"ContainerDied","Data":"8727adbd08a3ca5bed3dd1d23301105b8cb55bae17fe0f81c41857080c79500d"} Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.532478 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-catalog-content\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.533117 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66mb\" (UniqueName: \"kubernetes.io/projected/6668bc10-67e0-40a0-bdf8-760c01e67ffb-kube-api-access-x66mb\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.533284 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-utilities\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.635084 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-catalog-content\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.635633 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-catalog-content\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.635645 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66mb\" (UniqueName: \"kubernetes.io/projected/6668bc10-67e0-40a0-bdf8-760c01e67ffb-kube-api-access-x66mb\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: 
I0218 12:22:53.635811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-utilities\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.636380 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-utilities\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.655811 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66mb\" (UniqueName: \"kubernetes.io/projected/6668bc10-67e0-40a0-bdf8-760c01e67ffb-kube-api-access-x66mb\") pod \"certified-operators-ktl86\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:53 crc kubenswrapper[4922]: I0218 12:22:53.754083 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:22:54 crc kubenswrapper[4922]: W0218 12:22:54.245426 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6668bc10_67e0_40a0_bdf8_760c01e67ffb.slice/crio-5800716cdf2af93e328e6d448e636f39680309cf829bdaeeca901dc7afe39c91 WatchSource:0}: Error finding container 5800716cdf2af93e328e6d448e636f39680309cf829bdaeeca901dc7afe39c91: Status 404 returned error can't find the container with id 5800716cdf2af93e328e6d448e636f39680309cf829bdaeeca901dc7afe39c91 Feb 18 12:22:54 crc kubenswrapper[4922]: I0218 12:22:54.252800 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ktl86"] Feb 18 12:22:54 crc kubenswrapper[4922]: I0218 12:22:54.481948 4922 generic.go:334] "Generic (PLEG): container finished" podID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerID="f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9" exitCode=0 Feb 18 12:22:54 crc kubenswrapper[4922]: I0218 12:22:54.482074 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerDied","Data":"f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9"} Feb 18 12:22:54 crc kubenswrapper[4922]: I0218 12:22:54.482237 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerStarted","Data":"5800716cdf2af93e328e6d448e636f39680309cf829bdaeeca901dc7afe39c91"} Feb 18 12:22:54 crc kubenswrapper[4922]: I0218 12:22:54.484626 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"13358646-85fa-4761-b4e8-ce5baf8851da","Type":"ContainerStarted","Data":"35204e3522b95ed558a273f4aaedf1fd175ede45f45ff6adc97a12ea6fee34c8"} Feb 18 12:22:56 crc kubenswrapper[4922]: I0218 12:22:56.508194 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerStarted","Data":"05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e"} Feb 18 12:22:57 crc kubenswrapper[4922]: I0218 12:22:57.522489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"13358646-85fa-4761-b4e8-ce5baf8851da","Type":"ContainerStarted","Data":"4bcbcff1cb39e7c9b088b3aa53e88e0fbd4078ea0f6f0e23b93e68bb1838f2da"} Feb 18 12:22:57 crc kubenswrapper[4922]: I0218 12:22:57.522851 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"13358646-85fa-4761-b4e8-ce5baf8851da","Type":"ContainerStarted","Data":"9399d8895b96cf629da034389c9ca80ee6f91e327a0d6f52e7d26d4dd7e3c8a3"} Feb 18 12:22:57 crc kubenswrapper[4922]: I0218 12:22:57.524508 4922 generic.go:334] "Generic (PLEG): container finished" podID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerID="05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e" exitCode=0 Feb 18 12:22:57 crc kubenswrapper[4922]: I0218 12:22:57.524543 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerDied","Data":"05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e"} Feb 18 12:22:57 crc kubenswrapper[4922]: I0218 12:22:57.554487 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.554453658 podStartE2EDuration="17.554453658s" podCreationTimestamp="2026-02-18 12:22:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:22:57.54741072 +0000 UTC m=+2779.275114820" watchObservedRunningTime="2026-02-18 12:22:57.554453658 +0000 UTC m=+2779.282157768" Feb 18 12:22:58 crc kubenswrapper[4922]: I0218 12:22:58.544941 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerStarted","Data":"2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc"} Feb 18 12:22:58 crc kubenswrapper[4922]: I0218 12:22:58.572534 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ktl86" podStartSLOduration=2.040614907 podStartE2EDuration="5.572506118s" podCreationTimestamp="2026-02-18 12:22:53 +0000 UTC" firstStartedPulling="2026-02-18 12:22:54.483505434 +0000 UTC m=+2776.211209514" lastFinishedPulling="2026-02-18 12:22:58.015396645 +0000 UTC m=+2779.743100725" observedRunningTime="2026-02-18 12:22:58.562754741 +0000 UTC m=+2780.290458811" watchObservedRunningTime="2026-02-18 12:22:58.572506118 +0000 UTC m=+2780.300210208" Feb 18 12:23:01 crc kubenswrapper[4922]: I0218 12:23:01.099129 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 18 12:23:03 crc kubenswrapper[4922]: I0218 12:23:03.754451 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:23:03 crc kubenswrapper[4922]: I0218 12:23:03.754956 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:23:03 crc kubenswrapper[4922]: I0218 12:23:03.810212 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:23:04 crc 
kubenswrapper[4922]: I0218 12:23:04.641242 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:23:04 crc kubenswrapper[4922]: I0218 12:23:04.692126 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ktl86"] Feb 18 12:23:06 crc kubenswrapper[4922]: I0218 12:23:06.614671 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ktl86" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="registry-server" containerID="cri-o://2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc" gracePeriod=2 Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.156400 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ktl86" Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.195481 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-utilities\") pod \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.195884 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66mb\" (UniqueName: \"kubernetes.io/projected/6668bc10-67e0-40a0-bdf8-760c01e67ffb-kube-api-access-x66mb\") pod \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.195958 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-catalog-content\") pod \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\" (UID: \"6668bc10-67e0-40a0-bdf8-760c01e67ffb\") " Feb 18 12:23:07 
crc kubenswrapper[4922]: I0218 12:23:07.200444 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-utilities" (OuterVolumeSpecName: "utilities") pod "6668bc10-67e0-40a0-bdf8-760c01e67ffb" (UID: "6668bc10-67e0-40a0-bdf8-760c01e67ffb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.208410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6668bc10-67e0-40a0-bdf8-760c01e67ffb-kube-api-access-x66mb" (OuterVolumeSpecName: "kube-api-access-x66mb") pod "6668bc10-67e0-40a0-bdf8-760c01e67ffb" (UID: "6668bc10-67e0-40a0-bdf8-760c01e67ffb"). InnerVolumeSpecName "kube-api-access-x66mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.253746 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6668bc10-67e0-40a0-bdf8-760c01e67ffb" (UID: "6668bc10-67e0-40a0-bdf8-760c01e67ffb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.299769 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.299838 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x66mb\" (UniqueName: \"kubernetes.io/projected/6668bc10-67e0-40a0-bdf8-760c01e67ffb-kube-api-access-x66mb\") on node \"crc\" DevicePath \"\""
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.299850 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6668bc10-67e0-40a0-bdf8-760c01e67ffb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.627230 4922 generic.go:334] "Generic (PLEG): container finished" podID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerID="2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc" exitCode=0
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.627290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerDied","Data":"2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc"}
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.627391 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktl86" event={"ID":"6668bc10-67e0-40a0-bdf8-760c01e67ffb","Type":"ContainerDied","Data":"5800716cdf2af93e328e6d448e636f39680309cf829bdaeeca901dc7afe39c91"}
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.627418 4922 scope.go:117] "RemoveContainer" containerID="2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.627420 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ktl86"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.663081 4922 scope.go:117] "RemoveContainer" containerID="05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.686435 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ktl86"]
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.694707 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ktl86"]
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.709601 4922 scope.go:117] "RemoveContainer" containerID="f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.760655 4922 scope.go:117] "RemoveContainer" containerID="2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc"
Feb 18 12:23:07 crc kubenswrapper[4922]: E0218 12:23:07.761313 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc\": container with ID starting with 2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc not found: ID does not exist" containerID="2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.761388 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc"} err="failed to get container status \"2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc\": rpc error: code = NotFound desc = could not find container \"2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc\": container with ID starting with 2f06fa7d1ccad53762185f070ac63faec3b39c0ad9069caaed5a221f878e67cc not found: ID does not exist"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.761418 4922 scope.go:117] "RemoveContainer" containerID="05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e"
Feb 18 12:23:07 crc kubenswrapper[4922]: E0218 12:23:07.761766 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e\": container with ID starting with 05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e not found: ID does not exist" containerID="05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.761816 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e"} err="failed to get container status \"05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e\": rpc error: code = NotFound desc = could not find container \"05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e\": container with ID starting with 05b11aba883e59a330f5bc2e9bb5d474ccb35d90127c13922b78218898b4cb6e not found: ID does not exist"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.761836 4922 scope.go:117] "RemoveContainer" containerID="f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9"
Feb 18 12:23:07 crc kubenswrapper[4922]: E0218 12:23:07.762126 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9\": container with ID starting with f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9 not found: ID does not exist" containerID="f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9"
Feb 18 12:23:07 crc kubenswrapper[4922]: I0218 12:23:07.762181 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9"} err="failed to get container status \"f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9\": rpc error: code = NotFound desc = could not find container \"f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9\": container with ID starting with f52dca76a6dfec27e0a59ae63867fef5d319b89c5f69f836ea7fe6d9a54240c9 not found: ID does not exist"
Feb 18 12:23:08 crc kubenswrapper[4922]: I0218 12:23:08.986201 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" path="/var/lib/kubelet/pods/6668bc10-67e0-40a0-bdf8-760c01e67ffb/volumes"
Feb 18 12:23:11 crc kubenswrapper[4922]: I0218 12:23:11.099333 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 18 12:23:11 crc kubenswrapper[4922]: I0218 12:23:11.105354 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 18 12:23:11 crc kubenswrapper[4922]: I0218 12:23:11.675600 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.571156 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 12:23:34 crc kubenswrapper[4922]: E0218 12:23:34.572259 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="extract-content"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.572276 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="extract-content"
Feb 18 12:23:34 crc kubenswrapper[4922]: E0218 12:23:34.572295 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="registry-server"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.572301 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="registry-server"
Feb 18 12:23:34 crc kubenswrapper[4922]: E0218 12:23:34.572337 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="extract-utilities"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.572343 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="extract-utilities"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.572625 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6668bc10-67e0-40a0-bdf8-760c01e67ffb" containerName="registry-server"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.573591 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.579202 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-thfnr"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.579503 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.579637 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.579768 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.587034 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.670758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.670885 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqctm\" (UniqueName: \"kubernetes.io/projected/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-kube-api-access-vqctm\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.670970 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.671013 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.671065 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.671142 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-config-data\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.671187 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.671392 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.671432 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773105 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqctm\" (UniqueName: \"kubernetes.io/projected/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-kube-api-access-vqctm\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773311 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-config-data\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773460 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773520 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773544 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.773635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.774690 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.775173 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.775396 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.775397 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.775569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-config-data\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.784454 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.786489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.786696 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.796324 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqctm\" (UniqueName: \"kubernetes.io/projected/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-kube-api-access-vqctm\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.813978 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " pod="openstack/tempest-tests-tempest"
Feb 18 12:23:34 crc kubenswrapper[4922]: I0218 12:23:34.909020 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 18 12:23:35 crc kubenswrapper[4922]: I0218 12:23:35.402791 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 18 12:23:35 crc kubenswrapper[4922]: I0218 12:23:35.907565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4525818f-9e1d-48a0-8ec1-1a22a0841dd4","Type":"ContainerStarted","Data":"492694f74401bc119697f1caa4fa178df1922c217659e262bc75d36660dd58d8"}
Feb 18 12:23:47 crc kubenswrapper[4922]: I0218 12:23:47.268336 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 18 12:23:48 crc kubenswrapper[4922]: I0218 12:23:48.036463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4525818f-9e1d-48a0-8ec1-1a22a0841dd4","Type":"ContainerStarted","Data":"e3b2b8b928d4d252bf46e4bb853a742c089adf478f27339f292b3bd6347dcdc0"}
Feb 18 12:23:48 crc kubenswrapper[4922]: I0218 12:23:48.058832 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.198247956 podStartE2EDuration="15.058814711s" podCreationTimestamp="2026-02-18 12:23:33 +0000 UTC" firstStartedPulling="2026-02-18 12:23:35.404485018 +0000 UTC m=+2817.132189108" lastFinishedPulling="2026-02-18 12:23:47.265051753 +0000 UTC m=+2828.992755863" observedRunningTime="2026-02-18 12:23:48.052319787 +0000 UTC m=+2829.780023887" watchObservedRunningTime="2026-02-18 12:23:48.058814711 +0000 UTC m=+2829.786518791"
Feb 18 12:24:09 crc kubenswrapper[4922]: I0218 12:24:09.808105 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:24:09 crc kubenswrapper[4922]: I0218 12:24:09.808782 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:24:39 crc kubenswrapper[4922]: I0218 12:24:39.808089 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:24:39 crc kubenswrapper[4922]: I0218 12:24:39.808623 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:25:09 crc kubenswrapper[4922]: I0218 12:25:09.807734 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:25:09 crc kubenswrapper[4922]: I0218 12:25:09.808275 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:25:09 crc kubenswrapper[4922]: I0218 12:25:09.808319 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 12:25:09 crc kubenswrapper[4922]: I0218 12:25:09.808865 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef51f81fc06c6367e33199b91ef56039257473dad48437b29bc57a1983bf05c7"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 12:25:09 crc kubenswrapper[4922]: I0218 12:25:09.808932 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://ef51f81fc06c6367e33199b91ef56039257473dad48437b29bc57a1983bf05c7" gracePeriod=600
Feb 18 12:25:10 crc kubenswrapper[4922]: I0218 12:25:10.779526 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="ef51f81fc06c6367e33199b91ef56039257473dad48437b29bc57a1983bf05c7" exitCode=0
Feb 18 12:25:10 crc kubenswrapper[4922]: I0218 12:25:10.779589 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"ef51f81fc06c6367e33199b91ef56039257473dad48437b29bc57a1983bf05c7"}
Feb 18 12:25:10 crc kubenswrapper[4922]: I0218 12:25:10.779958 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a"}
Feb 18 12:25:10 crc kubenswrapper[4922]: I0218 12:25:10.779980 4922 scope.go:117] "RemoveContainer" containerID="9330193a32435c3bb58e3c62888aafde935dba8cdd2799d11f365fb7bef953ab"
Feb 18 12:27:29 crc kubenswrapper[4922]: I0218 12:27:29.882878 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kz8nw"]
Feb 18 12:27:29 crc kubenswrapper[4922]: I0218 12:27:29.886761 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:29 crc kubenswrapper[4922]: I0218 12:27:29.906907 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz8nw"]
Feb 18 12:27:29 crc kubenswrapper[4922]: I0218 12:27:29.917856 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-utilities\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:29 crc kubenswrapper[4922]: I0218 12:27:29.918165 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44tg\" (UniqueName: \"kubernetes.io/projected/476deb39-d3e2-47b1-a10a-1043938fbbe0-kube-api-access-z44tg\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:29 crc kubenswrapper[4922]: I0218 12:27:29.918330 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-catalog-content\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.019529 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44tg\" (UniqueName: \"kubernetes.io/projected/476deb39-d3e2-47b1-a10a-1043938fbbe0-kube-api-access-z44tg\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.019612 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-catalog-content\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.019664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-utilities\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.020130 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-catalog-content\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.020222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-utilities\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.045066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44tg\" (UniqueName: \"kubernetes.io/projected/476deb39-d3e2-47b1-a10a-1043938fbbe0-kube-api-access-z44tg\") pod \"community-operators-kz8nw\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.221141 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:30 crc kubenswrapper[4922]: I0218 12:27:30.784835 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz8nw"]
Feb 18 12:27:31 crc kubenswrapper[4922]: I0218 12:27:31.023746 4922 generic.go:334] "Generic (PLEG): container finished" podID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerID="fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5" exitCode=0
Feb 18 12:27:31 crc kubenswrapper[4922]: I0218 12:27:31.023856 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerDied","Data":"fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5"}
Feb 18 12:27:31 crc kubenswrapper[4922]: I0218 12:27:31.024063 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerStarted","Data":"9cc0ee33444edd82c3aef2b27c7dd32f17623fbbae0694e5eaf22e7c8ce3e73c"}
Feb 18 12:27:31 crc kubenswrapper[4922]: I0218 12:27:31.026509 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 12:27:33 crc kubenswrapper[4922]: I0218 12:27:33.044760 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerStarted","Data":"93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb"}
Feb 18 12:27:35 crc kubenswrapper[4922]: I0218 12:27:35.067731 4922 generic.go:334] "Generic (PLEG): container finished" podID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerID="93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb" exitCode=0
Feb 18 12:27:35 crc kubenswrapper[4922]: I0218 12:27:35.068074 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerDied","Data":"93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb"}
Feb 18 12:27:36 crc kubenswrapper[4922]: I0218 12:27:36.077101 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerStarted","Data":"2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a"}
Feb 18 12:27:36 crc kubenswrapper[4922]: I0218 12:27:36.099520 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kz8nw" podStartSLOduration=2.654062371 podStartE2EDuration="7.099505061s" podCreationTimestamp="2026-02-18 12:27:29 +0000 UTC" firstStartedPulling="2026-02-18 12:27:31.02546449 +0000 UTC m=+3052.753168570" lastFinishedPulling="2026-02-18 12:27:35.47090718 +0000 UTC m=+3057.198611260" observedRunningTime="2026-02-18 12:27:36.09515096 +0000 UTC m=+3057.822855070" watchObservedRunningTime="2026-02-18 12:27:36.099505061 +0000 UTC m=+3057.827209141"
Feb 18 12:27:39 crc kubenswrapper[4922]: I0218 12:27:39.807863 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:27:39 crc kubenswrapper[4922]: I0218 12:27:39.808491 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:27:40 crc kubenswrapper[4922]: I0218 12:27:40.221312 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:40 crc kubenswrapper[4922]: I0218 12:27:40.221378 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:40 crc kubenswrapper[4922]: I0218 12:27:40.275109 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:41 crc kubenswrapper[4922]: I0218 12:27:41.166982 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:44 crc kubenswrapper[4922]: I0218 12:27:44.472119 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kz8nw"]
Feb 18 12:27:44 crc kubenswrapper[4922]: I0218 12:27:44.472465 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kz8nw" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="registry-server" containerID="cri-o://2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a" gracePeriod=2
Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.150976 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8nw"
Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.158743 4922 generic.go:334] "Generic (PLEG): container finished" podID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerID="2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a" exitCode=0
Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.158791 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerDied","Data":"2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a"}
Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.158833 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8nw" event={"ID":"476deb39-d3e2-47b1-a10a-1043938fbbe0","Type":"ContainerDied","Data":"9cc0ee33444edd82c3aef2b27c7dd32f17623fbbae0694e5eaf22e7c8ce3e73c"}
Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.158857 4922 scope.go:117] "RemoveContainer" containerID="2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a"
Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.158925 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kz8nw" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.187606 4922 scope.go:117] "RemoveContainer" containerID="93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.215879 4922 scope.go:117] "RemoveContainer" containerID="fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.273495 4922 scope.go:117] "RemoveContainer" containerID="2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a" Feb 18 12:27:45 crc kubenswrapper[4922]: E0218 12:27:45.275527 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a\": container with ID starting with 2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a not found: ID does not exist" containerID="2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.275579 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a"} err="failed to get container status \"2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a\": rpc error: code = NotFound desc = could not find container \"2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a\": container with ID starting with 2218e06474151611f107b759e15bcdd7c89bff8e2dddc851df6132106966779a not found: ID does not exist" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.275613 4922 scope.go:117] "RemoveContainer" containerID="93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb" Feb 18 12:27:45 crc kubenswrapper[4922]: E0218 12:27:45.276005 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb\": container with ID starting with 93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb not found: ID does not exist" containerID="93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.276035 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb"} err="failed to get container status \"93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb\": rpc error: code = NotFound desc = could not find container \"93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb\": container with ID starting with 93a952cae5ab5bceca4d519c84887375579cf400e7a8973df36383eb70a98dbb not found: ID does not exist" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.276051 4922 scope.go:117] "RemoveContainer" containerID="fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5" Feb 18 12:27:45 crc kubenswrapper[4922]: E0218 12:27:45.276337 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5\": container with ID starting with fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5 not found: ID does not exist" containerID="fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.276481 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5"} err="failed to get container status \"fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5\": rpc error: code = NotFound desc = could not find container 
\"fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5\": container with ID starting with fe11e3a24b6b9e456a8b43ffe889d179e3cf3376d32b416411f7b9cf898ab3c5 not found: ID does not exist" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.340915 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-catalog-content\") pod \"476deb39-d3e2-47b1-a10a-1043938fbbe0\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.341024 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z44tg\" (UniqueName: \"kubernetes.io/projected/476deb39-d3e2-47b1-a10a-1043938fbbe0-kube-api-access-z44tg\") pod \"476deb39-d3e2-47b1-a10a-1043938fbbe0\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.341094 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-utilities\") pod \"476deb39-d3e2-47b1-a10a-1043938fbbe0\" (UID: \"476deb39-d3e2-47b1-a10a-1043938fbbe0\") " Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.342249 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-utilities" (OuterVolumeSpecName: "utilities") pod "476deb39-d3e2-47b1-a10a-1043938fbbe0" (UID: "476deb39-d3e2-47b1-a10a-1043938fbbe0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.347827 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476deb39-d3e2-47b1-a10a-1043938fbbe0-kube-api-access-z44tg" (OuterVolumeSpecName: "kube-api-access-z44tg") pod "476deb39-d3e2-47b1-a10a-1043938fbbe0" (UID: "476deb39-d3e2-47b1-a10a-1043938fbbe0"). InnerVolumeSpecName "kube-api-access-z44tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.389344 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "476deb39-d3e2-47b1-a10a-1043938fbbe0" (UID: "476deb39-d3e2-47b1-a10a-1043938fbbe0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.443236 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.443282 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z44tg\" (UniqueName: \"kubernetes.io/projected/476deb39-d3e2-47b1-a10a-1043938fbbe0-kube-api-access-z44tg\") on node \"crc\" DevicePath \"\"" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.443296 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476deb39-d3e2-47b1-a10a-1043938fbbe0-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 12:27:45.499860 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kz8nw"] Feb 18 12:27:45 crc kubenswrapper[4922]: I0218 
12:27:45.512869 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kz8nw"] Feb 18 12:27:46 crc kubenswrapper[4922]: I0218 12:27:46.985806 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" path="/var/lib/kubelet/pods/476deb39-d3e2-47b1-a10a-1043938fbbe0/volumes" Feb 18 12:28:09 crc kubenswrapper[4922]: I0218 12:28:09.807978 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:28:09 crc kubenswrapper[4922]: I0218 12:28:09.809215 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:28:39 crc kubenswrapper[4922]: I0218 12:28:39.807588 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:28:39 crc kubenswrapper[4922]: I0218 12:28:39.808208 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:28:39 crc kubenswrapper[4922]: I0218 12:28:39.808270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 12:28:39 crc kubenswrapper[4922]: I0218 12:28:39.809146 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:28:39 crc kubenswrapper[4922]: I0218 12:28:39.809219 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" gracePeriod=600 Feb 18 12:28:39 crc kubenswrapper[4922]: E0218 12:28:39.929500 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:28:40 crc kubenswrapper[4922]: I0218 12:28:40.673965 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" exitCode=0 Feb 18 12:28:40 crc kubenswrapper[4922]: I0218 12:28:40.674035 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a"} 
Feb 18 12:28:40 crc kubenswrapper[4922]: I0218 12:28:40.674287 4922 scope.go:117] "RemoveContainer" containerID="ef51f81fc06c6367e33199b91ef56039257473dad48437b29bc57a1983bf05c7" Feb 18 12:28:40 crc kubenswrapper[4922]: I0218 12:28:40.675167 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:28:40 crc kubenswrapper[4922]: E0218 12:28:40.675727 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:28:54 crc kubenswrapper[4922]: I0218 12:28:54.973272 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:28:54 crc kubenswrapper[4922]: E0218 12:28:54.974423 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:29:05 crc kubenswrapper[4922]: I0218 12:29:05.973684 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:29:05 crc kubenswrapper[4922]: E0218 12:29:05.974431 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:29:20 crc kubenswrapper[4922]: I0218 12:29:20.973114 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:29:20 crc kubenswrapper[4922]: E0218 12:29:20.974076 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:29:31 crc kubenswrapper[4922]: I0218 12:29:31.973322 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:29:31 crc kubenswrapper[4922]: E0218 12:29:31.974132 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:29:42 crc kubenswrapper[4922]: I0218 12:29:42.973558 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:29:42 crc kubenswrapper[4922]: E0218 12:29:42.974421 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:29:54 crc kubenswrapper[4922]: I0218 12:29:54.973243 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:29:54 crc kubenswrapper[4922]: E0218 12:29:54.974104 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.151269 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw"] Feb 18 12:30:00 crc kubenswrapper[4922]: E0218 12:30:00.152255 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="extract-content" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.152269 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="extract-content" Feb 18 12:30:00 crc kubenswrapper[4922]: E0218 12:30:00.152284 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="extract-utilities" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.152290 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="extract-utilities" Feb 18 12:30:00 crc kubenswrapper[4922]: E0218 12:30:00.152302 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="registry-server" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.152308 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="registry-server" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.152598 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="476deb39-d3e2-47b1-a10a-1043938fbbe0" containerName="registry-server" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.153280 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.155807 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.156289 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.165982 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw"] Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.326982 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9t25\" (UniqueName: \"kubernetes.io/projected/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-kube-api-access-g9t25\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.327152 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-secret-volume\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.327199 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-config-volume\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.429393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9t25\" (UniqueName: \"kubernetes.io/projected/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-kube-api-access-g9t25\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.429512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-secret-volume\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.429547 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-config-volume\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: 
I0218 12:30:00.430414 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-config-volume\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.435017 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-secret-volume\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.449675 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9t25\" (UniqueName: \"kubernetes.io/projected/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-kube-api-access-g9t25\") pod \"collect-profiles-29523630-956lw\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.477620 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:00 crc kubenswrapper[4922]: I0218 12:30:00.919033 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw"] Feb 18 12:30:01 crc kubenswrapper[4922]: I0218 12:30:01.391383 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" event={"ID":"e56f5497-9f1a-455d-8d92-36f5dbcafe8b","Type":"ContainerStarted","Data":"97add8b5c3088aee73c0bc262f74aeb973fd2de8d5be81106a8100386b524ddc"} Feb 18 12:30:01 crc kubenswrapper[4922]: I0218 12:30:01.391695 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" event={"ID":"e56f5497-9f1a-455d-8d92-36f5dbcafe8b","Type":"ContainerStarted","Data":"ffffbeddb8706dab52f0e684d84866f3959d4a3ec913158eddbdaed70cb3377e"} Feb 18 12:30:01 crc kubenswrapper[4922]: I0218 12:30:01.411812 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" podStartSLOduration=1.411794609 podStartE2EDuration="1.411794609s" podCreationTimestamp="2026-02-18 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:30:01.404375992 +0000 UTC m=+3203.132080072" watchObservedRunningTime="2026-02-18 12:30:01.411794609 +0000 UTC m=+3203.139498689" Feb 18 12:30:02 crc kubenswrapper[4922]: I0218 12:30:02.402585 4922 generic.go:334] "Generic (PLEG): container finished" podID="e56f5497-9f1a-455d-8d92-36f5dbcafe8b" containerID="97add8b5c3088aee73c0bc262f74aeb973fd2de8d5be81106a8100386b524ddc" exitCode=0 Feb 18 12:30:02 crc kubenswrapper[4922]: I0218 12:30:02.402640 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" event={"ID":"e56f5497-9f1a-455d-8d92-36f5dbcafe8b","Type":"ContainerDied","Data":"97add8b5c3088aee73c0bc262f74aeb973fd2de8d5be81106a8100386b524ddc"} Feb 18 12:30:03 crc kubenswrapper[4922]: I0218 12:30:03.839305 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.007961 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-config-volume\") pod \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.008210 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9t25\" (UniqueName: \"kubernetes.io/projected/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-kube-api-access-g9t25\") pod \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.008248 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-secret-volume\") pod \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\" (UID: \"e56f5497-9f1a-455d-8d92-36f5dbcafe8b\") " Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.008655 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-config-volume" (OuterVolumeSpecName: "config-volume") pod "e56f5497-9f1a-455d-8d92-36f5dbcafe8b" (UID: "e56f5497-9f1a-455d-8d92-36f5dbcafe8b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.013846 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e56f5497-9f1a-455d-8d92-36f5dbcafe8b" (UID: "e56f5497-9f1a-455d-8d92-36f5dbcafe8b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.014062 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-kube-api-access-g9t25" (OuterVolumeSpecName: "kube-api-access-g9t25") pod "e56f5497-9f1a-455d-8d92-36f5dbcafe8b" (UID: "e56f5497-9f1a-455d-8d92-36f5dbcafe8b"). InnerVolumeSpecName "kube-api-access-g9t25". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.110839 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9t25\" (UniqueName: \"kubernetes.io/projected/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-kube-api-access-g9t25\") on node \"crc\" DevicePath \"\"" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.110875 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.110884 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e56f5497-9f1a-455d-8d92-36f5dbcafe8b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.420182 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" 
event={"ID":"e56f5497-9f1a-455d-8d92-36f5dbcafe8b","Type":"ContainerDied","Data":"ffffbeddb8706dab52f0e684d84866f3959d4a3ec913158eddbdaed70cb3377e"} Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.420231 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523630-956lw" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.420243 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffffbeddb8706dab52f0e684d84866f3959d4a3ec913158eddbdaed70cb3377e" Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.484489 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"] Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.494528 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-sckv4"] Feb 18 12:30:04 crc kubenswrapper[4922]: I0218 12:30:04.990917 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bd299a-42ac-4c5a-93ff-5809da5517b3" path="/var/lib/kubelet/pods/74bd299a-42ac-4c5a-93ff-5809da5517b3/volumes" Feb 18 12:30:05 crc kubenswrapper[4922]: I0218 12:30:05.973895 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:30:05 crc kubenswrapper[4922]: E0218 12:30:05.974543 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:30:13 crc kubenswrapper[4922]: I0218 12:30:13.460564 4922 scope.go:117] "RemoveContainer" 
containerID="1e57799f76ef61ec42eb4d7506cd5272291d57133dccf113ac6a6ed7f96b16b6" Feb 18 12:30:16 crc kubenswrapper[4922]: I0218 12:30:16.973043 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:30:16 crc kubenswrapper[4922]: E0218 12:30:16.975119 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:30:28 crc kubenswrapper[4922]: I0218 12:30:28.981632 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:30:28 crc kubenswrapper[4922]: E0218 12:30:28.982485 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:30:41 crc kubenswrapper[4922]: I0218 12:30:41.972829 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:30:41 crc kubenswrapper[4922]: E0218 12:30:41.973854 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:30:53 crc kubenswrapper[4922]: I0218 12:30:53.973479 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:30:53 crc kubenswrapper[4922]: E0218 12:30:53.974576 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:31:05 crc kubenswrapper[4922]: I0218 12:31:05.973143 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:31:05 crc kubenswrapper[4922]: E0218 12:31:05.974312 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:31:20 crc kubenswrapper[4922]: I0218 12:31:20.973499 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:31:20 crc kubenswrapper[4922]: E0218 12:31:20.974253 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:31:32 crc kubenswrapper[4922]: I0218 12:31:32.973684 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:31:32 crc kubenswrapper[4922]: E0218 12:31:32.974674 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:31:43 crc kubenswrapper[4922]: I0218 12:31:43.973961 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:31:43 crc kubenswrapper[4922]: E0218 12:31:43.974722 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:31:54 crc kubenswrapper[4922]: I0218 12:31:54.973903 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:31:54 crc kubenswrapper[4922]: E0218 12:31:54.974795 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:32:09 crc kubenswrapper[4922]: I0218 12:32:09.972908 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:32:09 crc kubenswrapper[4922]: E0218 12:32:09.973717 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:32:20 crc kubenswrapper[4922]: I0218 12:32:20.973783 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:32:20 crc kubenswrapper[4922]: E0218 12:32:20.974599 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:32:35 crc kubenswrapper[4922]: I0218 12:32:35.973595 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:32:35 crc kubenswrapper[4922]: E0218 12:32:35.974311 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.178020 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n6xrc"] Feb 18 12:32:36 crc kubenswrapper[4922]: E0218 12:32:36.178421 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56f5497-9f1a-455d-8d92-36f5dbcafe8b" containerName="collect-profiles" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.178438 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56f5497-9f1a-455d-8d92-36f5dbcafe8b" containerName="collect-profiles" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.178667 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56f5497-9f1a-455d-8d92-36f5dbcafe8b" containerName="collect-profiles" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.185822 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.258830 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xrc"] Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.344849 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8k9h\" (UniqueName: \"kubernetes.io/projected/5e74836e-69fc-4faa-ac09-05926ad4810a-kube-api-access-b8k9h\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.345178 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e74836e-69fc-4faa-ac09-05926ad4810a-catalog-content\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.345204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e74836e-69fc-4faa-ac09-05926ad4810a-utilities\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.446728 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8k9h\" (UniqueName: \"kubernetes.io/projected/5e74836e-69fc-4faa-ac09-05926ad4810a-kube-api-access-b8k9h\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.446902 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e74836e-69fc-4faa-ac09-05926ad4810a-catalog-content\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.446931 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e74836e-69fc-4faa-ac09-05926ad4810a-utilities\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.447529 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e74836e-69fc-4faa-ac09-05926ad4810a-catalog-content\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.447590 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e74836e-69fc-4faa-ac09-05926ad4810a-utilities\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.472396 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8k9h\" (UniqueName: \"kubernetes.io/projected/5e74836e-69fc-4faa-ac09-05926ad4810a-kube-api-access-b8k9h\") pod \"redhat-operators-n6xrc\" (UID: \"5e74836e-69fc-4faa-ac09-05926ad4810a\") " pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.509841 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:36 crc kubenswrapper[4922]: I0218 12:32:36.998619 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xrc"] Feb 18 12:32:37 crc kubenswrapper[4922]: I0218 12:32:37.767682 4922 generic.go:334] "Generic (PLEG): container finished" podID="5e74836e-69fc-4faa-ac09-05926ad4810a" containerID="8ee471cb05a8d2384f66e39b6b4361473f43b7c046668715e859cf3505d7cf06" exitCode=0 Feb 18 12:32:37 crc kubenswrapper[4922]: I0218 12:32:37.767812 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xrc" event={"ID":"5e74836e-69fc-4faa-ac09-05926ad4810a","Type":"ContainerDied","Data":"8ee471cb05a8d2384f66e39b6b4361473f43b7c046668715e859cf3505d7cf06"} Feb 18 12:32:37 crc kubenswrapper[4922]: I0218 12:32:37.768060 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xrc" event={"ID":"5e74836e-69fc-4faa-ac09-05926ad4810a","Type":"ContainerStarted","Data":"1e8088badc46814acfe2155e1deee69be9d1251ea91af4603dca25b1daef5e16"} Feb 18 12:32:37 crc kubenswrapper[4922]: I0218 12:32:37.770275 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:32:48 crc kubenswrapper[4922]: I0218 12:32:48.881716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xrc" event={"ID":"5e74836e-69fc-4faa-ac09-05926ad4810a","Type":"ContainerStarted","Data":"afd804f20029a44d8ec673f1f244f41a3d5964ef86b2b8983cfaeb8895a19e3a"} Feb 18 12:32:50 crc kubenswrapper[4922]: I0218 12:32:50.906739 4922 generic.go:334] "Generic (PLEG): container finished" podID="5e74836e-69fc-4faa-ac09-05926ad4810a" containerID="afd804f20029a44d8ec673f1f244f41a3d5964ef86b2b8983cfaeb8895a19e3a" exitCode=0 Feb 18 12:32:50 crc kubenswrapper[4922]: I0218 12:32:50.906825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-n6xrc" event={"ID":"5e74836e-69fc-4faa-ac09-05926ad4810a","Type":"ContainerDied","Data":"afd804f20029a44d8ec673f1f244f41a3d5964ef86b2b8983cfaeb8895a19e3a"} Feb 18 12:32:50 crc kubenswrapper[4922]: I0218 12:32:50.973276 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:32:50 crc kubenswrapper[4922]: E0218 12:32:50.973636 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:32:51 crc kubenswrapper[4922]: I0218 12:32:51.917807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xrc" event={"ID":"5e74836e-69fc-4faa-ac09-05926ad4810a","Type":"ContainerStarted","Data":"c25c01881e6a39dfb1732a51ced35ab964fcbb9c5bc0168947ec789be35548e9"} Feb 18 12:32:51 crc kubenswrapper[4922]: I0218 12:32:51.943284 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n6xrc" podStartSLOduration=2.203397433 podStartE2EDuration="15.943266309s" podCreationTimestamp="2026-02-18 12:32:36 +0000 UTC" firstStartedPulling="2026-02-18 12:32:37.769929758 +0000 UTC m=+3359.497633858" lastFinishedPulling="2026-02-18 12:32:51.509798664 +0000 UTC m=+3373.237502734" observedRunningTime="2026-02-18 12:32:51.934458266 +0000 UTC m=+3373.662162376" watchObservedRunningTime="2026-02-18 12:32:51.943266309 +0000 UTC m=+3373.670970389" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.366022 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8hj8v"] Feb 
18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.369562 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.383139 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hj8v"] Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.477538 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-utilities\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.477698 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-catalog-content\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.477728 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdcw\" (UniqueName: \"kubernetes.io/projected/bd456290-ba12-4116-9a58-04ed7fcba476-kube-api-access-4mdcw\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.579521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-catalog-content\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" 
Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.579841 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdcw\" (UniqueName: \"kubernetes.io/projected/bd456290-ba12-4116-9a58-04ed7fcba476-kube-api-access-4mdcw\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.579962 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-utilities\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.580452 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-utilities\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.580866 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-catalog-content\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 12:32:53.635616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdcw\" (UniqueName: \"kubernetes.io/projected/bd456290-ba12-4116-9a58-04ed7fcba476-kube-api-access-4mdcw\") pod \"redhat-marketplace-8hj8v\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:53 crc kubenswrapper[4922]: I0218 
12:32:53.708864 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:32:54 crc kubenswrapper[4922]: I0218 12:32:54.845500 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hj8v"] Feb 18 12:32:54 crc kubenswrapper[4922]: I0218 12:32:54.949170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerStarted","Data":"37abbd67b2d6143cc216bd867afee16da3fa841e67518e563a9512ece0b85196"} Feb 18 12:32:55 crc kubenswrapper[4922]: I0218 12:32:55.962555 4922 generic.go:334] "Generic (PLEG): container finished" podID="bd456290-ba12-4116-9a58-04ed7fcba476" containerID="296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6" exitCode=0 Feb 18 12:32:55 crc kubenswrapper[4922]: I0218 12:32:55.962648 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerDied","Data":"296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6"} Feb 18 12:32:56 crc kubenswrapper[4922]: I0218 12:32:56.510434 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:56 crc kubenswrapper[4922]: I0218 12:32:56.510746 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:32:57 crc kubenswrapper[4922]: I0218 12:32:57.000968 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerStarted","Data":"dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778"} Feb 18 12:32:57 crc kubenswrapper[4922]: I0218 12:32:57.556844 4922 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n6xrc" podUID="5e74836e-69fc-4faa-ac09-05926ad4810a" containerName="registry-server" probeResult="failure" output=< Feb 18 12:32:57 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 12:32:57 crc kubenswrapper[4922]: > Feb 18 12:32:58 crc kubenswrapper[4922]: I0218 12:32:58.011020 4922 generic.go:334] "Generic (PLEG): container finished" podID="bd456290-ba12-4116-9a58-04ed7fcba476" containerID="dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778" exitCode=0 Feb 18 12:32:58 crc kubenswrapper[4922]: I0218 12:32:58.011059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerDied","Data":"dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778"} Feb 18 12:32:59 crc kubenswrapper[4922]: I0218 12:32:59.022942 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerStarted","Data":"6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452"} Feb 18 12:33:03 crc kubenswrapper[4922]: I0218 12:33:03.565287 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:33:03 crc kubenswrapper[4922]: E0218 12:33:03.566035 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:33:03 crc kubenswrapper[4922]: I0218 12:33:03.709605 4922 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:33:03 crc kubenswrapper[4922]: I0218 12:33:03.709651 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:33:03 crc kubenswrapper[4922]: I0218 12:33:03.767489 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:33:03 crc kubenswrapper[4922]: I0218 12:33:03.785228 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8hj8v" podStartSLOduration=8.336300466 podStartE2EDuration="10.785209327s" podCreationTimestamp="2026-02-18 12:32:53 +0000 UTC" firstStartedPulling="2026-02-18 12:32:55.964185991 +0000 UTC m=+3377.691890071" lastFinishedPulling="2026-02-18 12:32:58.413094852 +0000 UTC m=+3380.140798932" observedRunningTime="2026-02-18 12:32:59.043999131 +0000 UTC m=+3380.771703221" watchObservedRunningTime="2026-02-18 12:33:03.785209327 +0000 UTC m=+3385.512913407" Feb 18 12:33:04 crc kubenswrapper[4922]: I0218 12:33:04.645378 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:33:04 crc kubenswrapper[4922]: I0218 12:33:04.699172 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hj8v"] Feb 18 12:33:06 crc kubenswrapper[4922]: I0218 12:33:06.562628 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:33:06 crc kubenswrapper[4922]: I0218 12:33:06.611474 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n6xrc" Feb 18 12:33:06 crc kubenswrapper[4922]: I0218 12:33:06.614865 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-8hj8v" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="registry-server" containerID="cri-o://6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452" gracePeriod=2 Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.121982 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hj8v" Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.224140 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-catalog-content\") pod \"bd456290-ba12-4116-9a58-04ed7fcba476\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.224302 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-utilities\") pod \"bd456290-ba12-4116-9a58-04ed7fcba476\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.224476 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdcw\" (UniqueName: \"kubernetes.io/projected/bd456290-ba12-4116-9a58-04ed7fcba476-kube-api-access-4mdcw\") pod \"bd456290-ba12-4116-9a58-04ed7fcba476\" (UID: \"bd456290-ba12-4116-9a58-04ed7fcba476\") " Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.225395 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-utilities" (OuterVolumeSpecName: "utilities") pod "bd456290-ba12-4116-9a58-04ed7fcba476" (UID: "bd456290-ba12-4116-9a58-04ed7fcba476"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.226199 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.232200 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd456290-ba12-4116-9a58-04ed7fcba476-kube-api-access-4mdcw" (OuterVolumeSpecName: "kube-api-access-4mdcw") pod "bd456290-ba12-4116-9a58-04ed7fcba476" (UID: "bd456290-ba12-4116-9a58-04ed7fcba476"). InnerVolumeSpecName "kube-api-access-4mdcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.241835 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xrc"]
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.256324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd456290-ba12-4116-9a58-04ed7fcba476" (UID: "bd456290-ba12-4116-9a58-04ed7fcba476"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.328689 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd456290-ba12-4116-9a58-04ed7fcba476-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.328891 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mdcw\" (UniqueName: \"kubernetes.io/projected/bd456290-ba12-4116-9a58-04ed7fcba476-kube-api-access-4mdcw\") on node \"crc\" DevicePath \"\""
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.602175 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-48d4t"]
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.602418 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-48d4t" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="registry-server" containerID="cri-o://377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55" gracePeriod=2
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.637633 4922 generic.go:334] "Generic (PLEG): container finished" podID="bd456290-ba12-4116-9a58-04ed7fcba476" containerID="6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452" exitCode=0
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.638296 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerDied","Data":"6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452"}
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.638380 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hj8v" event={"ID":"bd456290-ba12-4116-9a58-04ed7fcba476","Type":"ContainerDied","Data":"37abbd67b2d6143cc216bd867afee16da3fa841e67518e563a9512ece0b85196"}
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.638387 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hj8v"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.638404 4922 scope.go:117] "RemoveContainer" containerID="6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.660851 4922 scope.go:117] "RemoveContainer" containerID="dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.681975 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hj8v"]
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.692124 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hj8v"]
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.698547 4922 scope.go:117] "RemoveContainer" containerID="296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.854587 4922 scope.go:117] "RemoveContainer" containerID="6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452"
Feb 18 12:33:07 crc kubenswrapper[4922]: E0218 12:33:07.855161 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452\": container with ID starting with 6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452 not found: ID does not exist" containerID="6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.855207 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452"} err="failed to get container status \"6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452\": rpc error: code = NotFound desc = could not find container \"6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452\": container with ID starting with 6cd64925f7df3c941d9123838b532a293586d9f1541dbd6b6b614a998aac6452 not found: ID does not exist"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.855231 4922 scope.go:117] "RemoveContainer" containerID="dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778"
Feb 18 12:33:07 crc kubenswrapper[4922]: E0218 12:33:07.855993 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778\": container with ID starting with dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778 not found: ID does not exist" containerID="dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.856030 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778"} err="failed to get container status \"dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778\": rpc error: code = NotFound desc = could not find container \"dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778\": container with ID starting with dc6eae0b7e35dd4caea940146fcc325f1c9769371fb668a92d65ea86bae48778 not found: ID does not exist"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.856053 4922 scope.go:117] "RemoveContainer" containerID="296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6"
Feb 18 12:33:07 crc kubenswrapper[4922]: E0218 12:33:07.856527 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6\": container with ID starting with 296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6 not found: ID does not exist" containerID="296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6"
Feb 18 12:33:07 crc kubenswrapper[4922]: I0218 12:33:07.856568 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6"} err="failed to get container status \"296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6\": rpc error: code = NotFound desc = could not find container \"296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6\": container with ID starting with 296617ba970beec03022dcbaf12a401d44ce2babd980348b145fc78e2c54a9d6 not found: ID does not exist"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.164572 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4t"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.247268 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-utilities\") pod \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") "
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.247334 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-catalog-content\") pod \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") "
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.247476 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2pw6\" (UniqueName: \"kubernetes.io/projected/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-kube-api-access-g2pw6\") pod \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\" (UID: \"f1faa074-0925-4c46-b2d7-3d5590f2bfb2\") "
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.254671 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-utilities" (OuterVolumeSpecName: "utilities") pod "f1faa074-0925-4c46-b2d7-3d5590f2bfb2" (UID: "f1faa074-0925-4c46-b2d7-3d5590f2bfb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.259767 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-kube-api-access-g2pw6" (OuterVolumeSpecName: "kube-api-access-g2pw6") pod "f1faa074-0925-4c46-b2d7-3d5590f2bfb2" (UID: "f1faa074-0925-4c46-b2d7-3d5590f2bfb2"). InnerVolumeSpecName "kube-api-access-g2pw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.349239 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.349275 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2pw6\" (UniqueName: \"kubernetes.io/projected/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-kube-api-access-g2pw6\") on node \"crc\" DevicePath \"\""
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.517317 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1faa074-0925-4c46-b2d7-3d5590f2bfb2" (UID: "f1faa074-0925-4c46-b2d7-3d5590f2bfb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.553226 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1faa074-0925-4c46-b2d7-3d5590f2bfb2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.653670 4922 generic.go:334] "Generic (PLEG): container finished" podID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerID="377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55" exitCode=0
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.653718 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerDied","Data":"377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55"}
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.653748 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4t" event={"ID":"f1faa074-0925-4c46-b2d7-3d5590f2bfb2","Type":"ContainerDied","Data":"b63935ff693150fab51446b824532dfaf0826366b92ab601cfc7bb7ab4edc1ce"}
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.653771 4922 scope.go:117] "RemoveContainer" containerID="377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.653891 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4t"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.701684 4922 scope.go:117] "RemoveContainer" containerID="23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.702599 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-48d4t"]
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.714704 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-48d4t"]
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.734941 4922 scope.go:117] "RemoveContainer" containerID="9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.805130 4922 scope.go:117] "RemoveContainer" containerID="377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55"
Feb 18 12:33:08 crc kubenswrapper[4922]: E0218 12:33:08.805560 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55\": container with ID starting with 377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55 not found: ID does not exist" containerID="377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.805592 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55"} err="failed to get container status \"377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55\": rpc error: code = NotFound desc = could not find container \"377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55\": container with ID starting with 377e6020d0b42c5266d8273b9b34ce499f228a6c68677cbc4c4e34b4940a1f55 not found: ID does not exist"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.805612 4922 scope.go:117] "RemoveContainer" containerID="23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6"
Feb 18 12:33:08 crc kubenswrapper[4922]: E0218 12:33:08.806009 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6\": container with ID starting with 23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6 not found: ID does not exist" containerID="23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.806032 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6"} err="failed to get container status \"23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6\": rpc error: code = NotFound desc = could not find container \"23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6\": container with ID starting with 23f296a4f83051f275c5ad75133b9d989ffbb5e5773e3cf9682aa82e8d0b8ec6 not found: ID does not exist"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.806045 4922 scope.go:117] "RemoveContainer" containerID="9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4"
Feb 18 12:33:08 crc kubenswrapper[4922]: E0218 12:33:08.806557 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4\": container with ID starting with 9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4 not found: ID does not exist" containerID="9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4"
Feb 18 12:33:08 crc kubenswrapper[4922]: I0218 12:33:08.806608 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4"} err="failed to get container status \"9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4\": rpc error: code = NotFound desc = could not find container \"9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4\": container with ID starting with 9e3a83e756a30ee92f9449f7ddc8339c21a1abdeda0792499d4a21920ed6c1c4 not found: ID does not exist"
Feb 18 12:33:09 crc kubenswrapper[4922]: I0218 12:33:09.009076 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" path="/var/lib/kubelet/pods/bd456290-ba12-4116-9a58-04ed7fcba476/volumes"
Feb 18 12:33:09 crc kubenswrapper[4922]: I0218 12:33:09.018618 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" path="/var/lib/kubelet/pods/f1faa074-0925-4c46-b2d7-3d5590f2bfb2/volumes"
Feb 18 12:33:14 crc kubenswrapper[4922]: I0218 12:33:14.973053 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a"
Feb 18 12:33:14 crc kubenswrapper[4922]: E0218 12:33:14.973790 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:33:25 crc kubenswrapper[4922]: I0218 12:33:25.973269 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a"
Feb 18 12:33:25 crc kubenswrapper[4922]: E0218 12:33:25.974120 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:33:39 crc kubenswrapper[4922]: I0218 12:33:39.972743 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a"
Feb 18 12:33:40 crc kubenswrapper[4922]: I0218 12:33:40.930978 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"617209a8a29981601b193fadb47bd31c1969eee3e663e9025cd5b23991442654"}
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.645713 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqpdr"]
Feb 18 12:34:06 crc kubenswrapper[4922]: E0218 12:34:06.646743 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="registry-server"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.646762 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="registry-server"
Feb 18 12:34:06 crc kubenswrapper[4922]: E0218 12:34:06.646775 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="extract-content"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.646783 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="extract-content"
Feb 18 12:34:06 crc kubenswrapper[4922]: E0218 12:34:06.646801 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="extract-content"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.646811 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="extract-content"
Feb 18 12:34:06 crc kubenswrapper[4922]: E0218 12:34:06.646830 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="extract-utilities"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.646838 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="extract-utilities"
Feb 18 12:34:06 crc kubenswrapper[4922]: E0218 12:34:06.646863 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="registry-server"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.646871 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="registry-server"
Feb 18 12:34:06 crc kubenswrapper[4922]: E0218 12:34:06.646886 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="extract-utilities"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.646893 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="extract-utilities"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.647141 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd456290-ba12-4116-9a58-04ed7fcba476" containerName="registry-server"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.647156 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1faa074-0925-4c46-b2d7-3d5590f2bfb2" containerName="registry-server"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.649950 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.652900 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmq7z\" (UniqueName: \"kubernetes.io/projected/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-kube-api-access-vmq7z\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.653195 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-catalog-content\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.653276 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-utilities\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.660079 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqpdr"]
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.754801 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-utilities\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.755117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmq7z\" (UniqueName: \"kubernetes.io/projected/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-kube-api-access-vmq7z\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.755232 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-catalog-content\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.755605 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-utilities\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.755686 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-catalog-content\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.779953 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmq7z\" (UniqueName: \"kubernetes.io/projected/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-kube-api-access-vmq7z\") pod \"certified-operators-vqpdr\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") " pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:06 crc kubenswrapper[4922]: I0218 12:34:06.977267 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:07 crc kubenswrapper[4922]: I0218 12:34:07.469805 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqpdr"]
Feb 18 12:34:07 crc kubenswrapper[4922]: I0218 12:34:07.645587 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerStarted","Data":"e9c90e41c6e32a490492f7cdbaf233137eb96ac3ecd876f57a74114cd5f4c029"}
Feb 18 12:34:08 crc kubenswrapper[4922]: I0218 12:34:08.654873 4922 generic.go:334] "Generic (PLEG): container finished" podID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerID="d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e" exitCode=0
Feb 18 12:34:08 crc kubenswrapper[4922]: I0218 12:34:08.655087 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerDied","Data":"d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e"}
Feb 18 12:34:10 crc kubenswrapper[4922]: I0218 12:34:10.674149 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerStarted","Data":"4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f"}
Feb 18 12:34:11 crc kubenswrapper[4922]: I0218 12:34:11.683968 4922 generic.go:334] "Generic (PLEG): container finished" podID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerID="4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f" exitCode=0
Feb 18 12:34:11 crc kubenswrapper[4922]: I0218 12:34:11.684088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerDied","Data":"4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f"}
Feb 18 12:34:12 crc kubenswrapper[4922]: I0218 12:34:12.702079 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerStarted","Data":"f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e"}
Feb 18 12:34:12 crc kubenswrapper[4922]: I0218 12:34:12.721960 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqpdr" podStartSLOduration=3.296533479 podStartE2EDuration="6.721938648s" podCreationTimestamp="2026-02-18 12:34:06 +0000 UTC" firstStartedPulling="2026-02-18 12:34:08.656591271 +0000 UTC m=+3450.384295351" lastFinishedPulling="2026-02-18 12:34:12.08199642 +0000 UTC m=+3453.809700520" observedRunningTime="2026-02-18 12:34:12.718931722 +0000 UTC m=+3454.446635802" watchObservedRunningTime="2026-02-18 12:34:12.721938648 +0000 UTC m=+3454.449642728"
Feb 18 12:34:16 crc kubenswrapper[4922]: I0218 12:34:16.985488 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:16 crc kubenswrapper[4922]: I0218 12:34:16.986085 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:17 crc kubenswrapper[4922]: I0218 12:34:17.028285 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:17 crc kubenswrapper[4922]: I0218 12:34:17.789275 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:17 crc kubenswrapper[4922]: I0218 12:34:17.838026 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqpdr"]
Feb 18 12:34:19 crc kubenswrapper[4922]: I0218 12:34:19.765710 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqpdr" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="registry-server" containerID="cri-o://f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e" gracePeriod=2
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.312378 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.419217 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmq7z\" (UniqueName: \"kubernetes.io/projected/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-kube-api-access-vmq7z\") pod \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") "
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.419270 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-catalog-content\") pod \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") "
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.419547 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-utilities\") pod \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\" (UID: \"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5\") "
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.420480 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-utilities" (OuterVolumeSpecName: "utilities") pod "d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" (UID: "d1f3187c-1e35-4bda-a0d3-2969d44c1ad5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.426483 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-kube-api-access-vmq7z" (OuterVolumeSpecName: "kube-api-access-vmq7z") pod "d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" (UID: "d1f3187c-1e35-4bda-a0d3-2969d44c1ad5"). InnerVolumeSpecName "kube-api-access-vmq7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.521749 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.521955 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmq7z\" (UniqueName: \"kubernetes.io/projected/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-kube-api-access-vmq7z\") on node \"crc\" DevicePath \"\""
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.775208 4922 generic.go:334] "Generic (PLEG): container finished" podID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerID="f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e" exitCode=0
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.775532 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerDied","Data":"f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e"}
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.775580 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqpdr"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.775674 4922 scope.go:117] "RemoveContainer" containerID="f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.775653 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqpdr" event={"ID":"d1f3187c-1e35-4bda-a0d3-2969d44c1ad5","Type":"ContainerDied","Data":"e9c90e41c6e32a490492f7cdbaf233137eb96ac3ecd876f57a74114cd5f4c029"}
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.795884 4922 scope.go:117] "RemoveContainer" containerID="4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.818554 4922 scope.go:117] "RemoveContainer" containerID="d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.869133 4922 scope.go:117] "RemoveContainer" containerID="f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e"
Feb 18 12:34:20 crc kubenswrapper[4922]: E0218 12:34:20.869500 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e\": container with ID starting with f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e not found: ID does not exist" containerID="f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.869544 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e"} err="failed to get container status \"f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e\": rpc error: code = NotFound desc = could not find container \"f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e\": container with ID starting with f1c7e176a50ab4ad1ed3a22837e437925e2c95b8a9a2c11b7bebb33d69dfda4e not found: ID does not exist"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.869573 4922 scope.go:117] "RemoveContainer" containerID="4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f"
Feb 18 12:34:20 crc kubenswrapper[4922]: E0218 12:34:20.869915 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f\": container with ID starting with 4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f not found: ID does not exist" containerID="4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.869956 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f"} err="failed to get container status \"4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f\": rpc error: code = NotFound desc = could not find container \"4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f\": container with ID starting with 4b6426978a8c701dfd748b188e2342eeb4b3fe08cb2fc2f0798d400869864f5f not found: ID does not exist"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.869990 4922 scope.go:117] "RemoveContainer" containerID="d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e"
Feb 18 12:34:20 crc kubenswrapper[4922]: E0218 12:34:20.870271 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e\": container with ID starting with d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e not found: ID does not exist" containerID="d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.870303 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e"} err="failed to get container status \"d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e\": rpc error: code = NotFound desc = could not find container \"d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e\": container with ID starting with d3ee751d5747cbf6f52f6c827ede651bb584829aee903c09210e24a6a7b80f9e not found: ID does not exist"
Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.892510 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" (UID: "d1f3187c-1e35-4bda-a0d3-2969d44c1ad5"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:34:20 crc kubenswrapper[4922]: I0218 12:34:20.928229 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:34:21 crc kubenswrapper[4922]: I0218 12:34:21.099387 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqpdr"] Feb 18 12:34:21 crc kubenswrapper[4922]: I0218 12:34:21.109395 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqpdr"] Feb 18 12:34:22 crc kubenswrapper[4922]: I0218 12:34:22.984785 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" path="/var/lib/kubelet/pods/d1f3187c-1e35-4bda-a0d3-2969d44c1ad5/volumes" Feb 18 12:36:09 crc kubenswrapper[4922]: I0218 12:36:09.807697 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:36:09 crc kubenswrapper[4922]: I0218 12:36:09.808217 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:36:39 crc kubenswrapper[4922]: I0218 12:36:39.807849 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 18 12:36:39 crc kubenswrapper[4922]: I0218 12:36:39.808424 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:37:09 crc kubenswrapper[4922]: I0218 12:37:09.807931 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:37:09 crc kubenswrapper[4922]: I0218 12:37:09.809216 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:37:09 crc kubenswrapper[4922]: I0218 12:37:09.809303 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 12:37:09 crc kubenswrapper[4922]: I0218 12:37:09.810074 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"617209a8a29981601b193fadb47bd31c1969eee3e663e9025cd5b23991442654"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:37:09 crc kubenswrapper[4922]: I0218 12:37:09.810143 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://617209a8a29981601b193fadb47bd31c1969eee3e663e9025cd5b23991442654" gracePeriod=600 Feb 18 12:37:10 crc kubenswrapper[4922]: I0218 12:37:10.387215 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="617209a8a29981601b193fadb47bd31c1969eee3e663e9025cd5b23991442654" exitCode=0 Feb 18 12:37:10 crc kubenswrapper[4922]: I0218 12:37:10.387353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"617209a8a29981601b193fadb47bd31c1969eee3e663e9025cd5b23991442654"} Feb 18 12:37:10 crc kubenswrapper[4922]: I0218 12:37:10.387670 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"} Feb 18 12:37:10 crc kubenswrapper[4922]: I0218 12:37:10.387688 4922 scope.go:117] "RemoveContainer" containerID="16600860fb4ac9263fdc6d81dfcafd7d172469c3e817b2a15138f9381adaba7a" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.771792 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rj9ln"] Feb 18 12:38:04 crc kubenswrapper[4922]: E0218 12:38:04.772922 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="extract-content" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.772939 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="extract-content" Feb 18 12:38:04 crc kubenswrapper[4922]: E0218 12:38:04.772958 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="registry-server" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.772966 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="registry-server" Feb 18 12:38:04 crc kubenswrapper[4922]: E0218 12:38:04.772992 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="extract-utilities" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.773002 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="extract-utilities" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.773273 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f3187c-1e35-4bda-a0d3-2969d44c1ad5" containerName="registry-server" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.775603 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.786743 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rj9ln"] Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.895198 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqsr\" (UniqueName: \"kubernetes.io/projected/9802c743-deb5-4b6c-9484-2336cb49265c-kube-api-access-ppqsr\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.895926 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-utilities\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.896452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-catalog-content\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.999071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-catalog-content\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.999337 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ppqsr\" (UniqueName: \"kubernetes.io/projected/9802c743-deb5-4b6c-9484-2336cb49265c-kube-api-access-ppqsr\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.999385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-utilities\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.999626 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-catalog-content\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:04 crc kubenswrapper[4922]: I0218 12:38:04.999965 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-utilities\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:05 crc kubenswrapper[4922]: I0218 12:38:05.030399 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqsr\" (UniqueName: \"kubernetes.io/projected/9802c743-deb5-4b6c-9484-2336cb49265c-kube-api-access-ppqsr\") pod \"community-operators-rj9ln\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:05 crc kubenswrapper[4922]: I0218 12:38:05.103393 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:05 crc kubenswrapper[4922]: I0218 12:38:05.615702 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rj9ln"] Feb 18 12:38:06 crc kubenswrapper[4922]: I0218 12:38:06.111935 4922 generic.go:334] "Generic (PLEG): container finished" podID="9802c743-deb5-4b6c-9484-2336cb49265c" containerID="0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f" exitCode=0 Feb 18 12:38:06 crc kubenswrapper[4922]: I0218 12:38:06.111980 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerDied","Data":"0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f"} Feb 18 12:38:06 crc kubenswrapper[4922]: I0218 12:38:06.112026 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerStarted","Data":"3eb4a03c2248714dc926c7d21c85f1bf14169f529c1694e23cb56a243ebfe742"} Feb 18 12:38:06 crc kubenswrapper[4922]: I0218 12:38:06.114935 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:38:08 crc kubenswrapper[4922]: I0218 12:38:08.133839 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerStarted","Data":"801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf"} Feb 18 12:38:13 crc kubenswrapper[4922]: I0218 12:38:13.176048 4922 generic.go:334] "Generic (PLEG): container finished" podID="9802c743-deb5-4b6c-9484-2336cb49265c" containerID="801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf" exitCode=0 Feb 18 12:38:13 crc kubenswrapper[4922]: I0218 12:38:13.176117 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerDied","Data":"801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf"} Feb 18 12:38:15 crc kubenswrapper[4922]: I0218 12:38:15.200437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerStarted","Data":"8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126"} Feb 18 12:38:15 crc kubenswrapper[4922]: I0218 12:38:15.226143 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rj9ln" podStartSLOduration=3.265438528 podStartE2EDuration="11.226120474s" podCreationTimestamp="2026-02-18 12:38:04 +0000 UTC" firstStartedPulling="2026-02-18 12:38:06.114045678 +0000 UTC m=+3687.841749758" lastFinishedPulling="2026-02-18 12:38:14.074727604 +0000 UTC m=+3695.802431704" observedRunningTime="2026-02-18 12:38:15.21887643 +0000 UTC m=+3696.946580510" watchObservedRunningTime="2026-02-18 12:38:15.226120474 +0000 UTC m=+3696.953824554" Feb 18 12:38:25 crc kubenswrapper[4922]: I0218 12:38:25.103871 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:25 crc kubenswrapper[4922]: I0218 12:38:25.104656 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:25 crc kubenswrapper[4922]: I0218 12:38:25.151270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:25 crc kubenswrapper[4922]: I0218 12:38:25.352912 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:25 crc kubenswrapper[4922]: I0218 
12:38:25.407314 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rj9ln"] Feb 18 12:38:27 crc kubenswrapper[4922]: I0218 12:38:27.307916 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rj9ln" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="registry-server" containerID="cri-o://8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126" gracePeriod=2 Feb 18 12:38:27 crc kubenswrapper[4922]: I0218 12:38:27.867877 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.058764 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-utilities\") pod \"9802c743-deb5-4b6c-9484-2336cb49265c\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.058838 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppqsr\" (UniqueName: \"kubernetes.io/projected/9802c743-deb5-4b6c-9484-2336cb49265c-kube-api-access-ppqsr\") pod \"9802c743-deb5-4b6c-9484-2336cb49265c\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.058960 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-catalog-content\") pod \"9802c743-deb5-4b6c-9484-2336cb49265c\" (UID: \"9802c743-deb5-4b6c-9484-2336cb49265c\") " Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.059733 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-utilities" (OuterVolumeSpecName: 
"utilities") pod "9802c743-deb5-4b6c-9484-2336cb49265c" (UID: "9802c743-deb5-4b6c-9484-2336cb49265c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.067282 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9802c743-deb5-4b6c-9484-2336cb49265c-kube-api-access-ppqsr" (OuterVolumeSpecName: "kube-api-access-ppqsr") pod "9802c743-deb5-4b6c-9484-2336cb49265c" (UID: "9802c743-deb5-4b6c-9484-2336cb49265c"). InnerVolumeSpecName "kube-api-access-ppqsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.121523 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9802c743-deb5-4b6c-9484-2336cb49265c" (UID: "9802c743-deb5-4b6c-9484-2336cb49265c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.162058 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.162098 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppqsr\" (UniqueName: \"kubernetes.io/projected/9802c743-deb5-4b6c-9484-2336cb49265c-kube-api-access-ppqsr\") on node \"crc\" DevicePath \"\"" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.162111 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802c743-deb5-4b6c-9484-2336cb49265c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.320116 4922 generic.go:334] "Generic (PLEG): container finished" podID="9802c743-deb5-4b6c-9484-2336cb49265c" containerID="8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126" exitCode=0 Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.320175 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rj9ln" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.320173 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerDied","Data":"8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126"} Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.320315 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rj9ln" event={"ID":"9802c743-deb5-4b6c-9484-2336cb49265c","Type":"ContainerDied","Data":"3eb4a03c2248714dc926c7d21c85f1bf14169f529c1694e23cb56a243ebfe742"} Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.320341 4922 scope.go:117] "RemoveContainer" containerID="8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.343854 4922 scope.go:117] "RemoveContainer" containerID="801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.364753 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rj9ln"] Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.379043 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rj9ln"] Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.380496 4922 scope.go:117] "RemoveContainer" containerID="0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.420055 4922 scope.go:117] "RemoveContainer" containerID="8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126" Feb 18 12:38:28 crc kubenswrapper[4922]: E0218 12:38:28.420523 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126\": container with ID starting with 8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126 not found: ID does not exist" containerID="8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.420564 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126"} err="failed to get container status \"8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126\": rpc error: code = NotFound desc = could not find container \"8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126\": container with ID starting with 8c12b89aeab1b8e1855ba7f24cf663759741c9a00c975fe91969456b730c9126 not found: ID does not exist" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.420591 4922 scope.go:117] "RemoveContainer" containerID="801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf" Feb 18 12:38:28 crc kubenswrapper[4922]: E0218 12:38:28.420871 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf\": container with ID starting with 801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf not found: ID does not exist" containerID="801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.420942 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf"} err="failed to get container status \"801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf\": rpc error: code = NotFound desc = could not find container \"801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf\": container with ID 
starting with 801d1c22b45cb2e89fe95fe3ea38c34d7591eb2afb03c53f75e7344cc3e55fcf not found: ID does not exist" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.420984 4922 scope.go:117] "RemoveContainer" containerID="0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f" Feb 18 12:38:28 crc kubenswrapper[4922]: E0218 12:38:28.421404 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f\": container with ID starting with 0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f not found: ID does not exist" containerID="0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.421437 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f"} err="failed to get container status \"0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f\": rpc error: code = NotFound desc = could not find container \"0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f\": container with ID starting with 0fdda781022745ae094779b7a89ad71538c6121b5be91abadd095ef46c0b037f not found: ID does not exist" Feb 18 12:38:28 crc kubenswrapper[4922]: I0218 12:38:28.987275 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" path="/var/lib/kubelet/pods/9802c743-deb5-4b6c-9484-2336cb49265c/volumes" Feb 18 12:39:39 crc kubenswrapper[4922]: I0218 12:39:39.807584 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:39:39 crc kubenswrapper[4922]: I0218 
12:39:39.808188 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:40:09 crc kubenswrapper[4922]: I0218 12:40:09.808123 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:40:09 crc kubenswrapper[4922]: I0218 12:40:09.808951 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:40:39 crc kubenswrapper[4922]: I0218 12:40:39.806941 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:40:39 crc kubenswrapper[4922]: I0218 12:40:39.808228 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:40:39 crc kubenswrapper[4922]: I0218 12:40:39.808610 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 12:40:39 crc kubenswrapper[4922]: I0218 12:40:39.809675 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 12:40:39 crc kubenswrapper[4922]: I0218 12:40:39.809756 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62" gracePeriod=600
Feb 18 12:40:39 crc kubenswrapper[4922]: E0218 12:40:39.941726 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:40:40 crc kubenswrapper[4922]: I0218 12:40:40.515076 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62" exitCode=0
Feb 18 12:40:40 crc kubenswrapper[4922]: I0218 12:40:40.515153 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"}
Feb 18 12:40:40 crc kubenswrapper[4922]: I0218 12:40:40.515503 4922 scope.go:117] "RemoveContainer" containerID="617209a8a29981601b193fadb47bd31c1969eee3e663e9025cd5b23991442654"
Feb 18 12:40:40 crc kubenswrapper[4922]: I0218 12:40:40.516200 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:40:40 crc kubenswrapper[4922]: E0218 12:40:40.516507 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:40:54 crc kubenswrapper[4922]: I0218 12:40:54.973676 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:40:54 crc kubenswrapper[4922]: E0218 12:40:54.974544 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:41:05 crc kubenswrapper[4922]: I0218 12:41:05.973255 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:41:05 crc kubenswrapper[4922]: E0218 12:41:05.974090 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:41:18 crc kubenswrapper[4922]: I0218 12:41:18.979634 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:41:18 crc kubenswrapper[4922]: E0218 12:41:18.980575 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:41:32 crc kubenswrapper[4922]: I0218 12:41:32.973001 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:41:32 crc kubenswrapper[4922]: E0218 12:41:32.973778 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:41:46 crc kubenswrapper[4922]: I0218 12:41:46.976306 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:41:46 crc kubenswrapper[4922]: E0218 12:41:46.977326 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:42:00 crc kubenswrapper[4922]: I0218 12:42:00.973270 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:42:00 crc kubenswrapper[4922]: E0218 12:42:00.974055 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:42:15 crc kubenswrapper[4922]: I0218 12:42:15.974196 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:42:15 crc kubenswrapper[4922]: E0218 12:42:15.975066 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:42:28 crc kubenswrapper[4922]: I0218 12:42:28.981143 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:42:28 crc kubenswrapper[4922]: E0218 12:42:28.982067 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.251915 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-clkkl"]
Feb 18 12:42:42 crc kubenswrapper[4922]: E0218 12:42:42.253032 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="extract-content"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.253051 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="extract-content"
Feb 18 12:42:42 crc kubenswrapper[4922]: E0218 12:42:42.253090 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="registry-server"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.253098 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="registry-server"
Feb 18 12:42:42 crc kubenswrapper[4922]: E0218 12:42:42.253110 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="extract-utilities"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.253119 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="extract-utilities"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.253334 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9802c743-deb5-4b6c-9484-2336cb49265c" containerName="registry-server"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.255115 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.264056 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clkkl"]
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.326180 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-catalog-content\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.326571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xsfp\" (UniqueName: \"kubernetes.io/projected/63579133-a220-43c4-a314-288bc8f38929-kube-api-access-5xsfp\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.326792 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-utilities\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.429570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xsfp\" (UniqueName: \"kubernetes.io/projected/63579133-a220-43c4-a314-288bc8f38929-kube-api-access-5xsfp\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.429700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-utilities\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.429877 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-catalog-content\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.430351 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-utilities\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.430393 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-catalog-content\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.452429 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xsfp\" (UniqueName: \"kubernetes.io/projected/63579133-a220-43c4-a314-288bc8f38929-kube-api-access-5xsfp\") pod \"redhat-operators-clkkl\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") " pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.580561 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:42:42 crc kubenswrapper[4922]: I0218 12:42:42.973550 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:42:42 crc kubenswrapper[4922]: E0218 12:42:42.974203 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:42:43 crc kubenswrapper[4922]: I0218 12:42:43.083055 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clkkl"]
Feb 18 12:42:43 crc kubenswrapper[4922]: I0218 12:42:43.569007 4922 generic.go:334] "Generic (PLEG): container finished" podID="63579133-a220-43c4-a314-288bc8f38929" containerID="e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663" exitCode=0
Feb 18 12:42:43 crc kubenswrapper[4922]: I0218 12:42:43.569053 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerDied","Data":"e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663"}
Feb 18 12:42:43 crc kubenswrapper[4922]: I0218 12:42:43.569077 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerStarted","Data":"dd225ac6793d793b66eff5b5c2ffd76a38a665620014f94d05e48321aa82ad41"}
Feb 18 12:42:45 crc kubenswrapper[4922]: I0218 12:42:45.590476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerStarted","Data":"196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221"}
Feb 18 12:42:53 crc kubenswrapper[4922]: I0218 12:42:53.682482 4922 generic.go:334] "Generic (PLEG): container finished" podID="63579133-a220-43c4-a314-288bc8f38929" containerID="196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221" exitCode=0
Feb 18 12:42:53 crc kubenswrapper[4922]: I0218 12:42:53.682562 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerDied","Data":"196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221"}
Feb 18 12:42:54 crc kubenswrapper[4922]: I0218 12:42:54.693837 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerStarted","Data":"a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99"}
Feb 18 12:42:54 crc kubenswrapper[4922]: I0218 12:42:54.713049 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-clkkl" podStartSLOduration=2.202911295 podStartE2EDuration="12.71302573s" podCreationTimestamp="2026-02-18 12:42:42 +0000 UTC" firstStartedPulling="2026-02-18 12:42:43.57176106 +0000 UTC m=+3965.299465140" lastFinishedPulling="2026-02-18 12:42:54.081875495 +0000 UTC m=+3975.809579575" observedRunningTime="2026-02-18 12:42:54.711355078 +0000 UTC m=+3976.439059168" watchObservedRunningTime="2026-02-18 12:42:54.71302573 +0000 UTC m=+3976.440729810"
Feb 18 12:42:56 crc kubenswrapper[4922]: I0218 12:42:56.974514 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:42:56 crc kubenswrapper[4922]: E0218 12:42:56.975221 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:43:02 crc kubenswrapper[4922]: I0218 12:43:02.581493 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:43:02 crc kubenswrapper[4922]: I0218 12:43:02.582544 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:43:02 crc kubenswrapper[4922]: I0218 12:43:02.629881 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:43:02 crc kubenswrapper[4922]: I0218 12:43:02.809491 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:43:02 crc kubenswrapper[4922]: I0218 12:43:02.863878 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clkkl"]
Feb 18 12:43:04 crc kubenswrapper[4922]: I0218 12:43:04.779574 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-clkkl" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="registry-server" containerID="cri-o://a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99" gracePeriod=2
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.276729 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.369669 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-utilities\") pod \"63579133-a220-43c4-a314-288bc8f38929\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") "
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.369819 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xsfp\" (UniqueName: \"kubernetes.io/projected/63579133-a220-43c4-a314-288bc8f38929-kube-api-access-5xsfp\") pod \"63579133-a220-43c4-a314-288bc8f38929\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") "
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.369853 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-catalog-content\") pod \"63579133-a220-43c4-a314-288bc8f38929\" (UID: \"63579133-a220-43c4-a314-288bc8f38929\") "
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.370699 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-utilities" (OuterVolumeSpecName: "utilities") pod "63579133-a220-43c4-a314-288bc8f38929" (UID: "63579133-a220-43c4-a314-288bc8f38929"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.375642 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63579133-a220-43c4-a314-288bc8f38929-kube-api-access-5xsfp" (OuterVolumeSpecName: "kube-api-access-5xsfp") pod "63579133-a220-43c4-a314-288bc8f38929" (UID: "63579133-a220-43c4-a314-288bc8f38929"). InnerVolumeSpecName "kube-api-access-5xsfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.471610 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xsfp\" (UniqueName: \"kubernetes.io/projected/63579133-a220-43c4-a314-288bc8f38929-kube-api-access-5xsfp\") on node \"crc\" DevicePath \"\""
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.471651 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.502994 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63579133-a220-43c4-a314-288bc8f38929" (UID: "63579133-a220-43c4-a314-288bc8f38929"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.573470 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63579133-a220-43c4-a314-288bc8f38929-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.792068 4922 generic.go:334] "Generic (PLEG): container finished" podID="63579133-a220-43c4-a314-288bc8f38929" containerID="a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99" exitCode=0
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.792109 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clkkl"
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.792124 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerDied","Data":"a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99"}
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.792164 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clkkl" event={"ID":"63579133-a220-43c4-a314-288bc8f38929","Type":"ContainerDied","Data":"dd225ac6793d793b66eff5b5c2ffd76a38a665620014f94d05e48321aa82ad41"}
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.792180 4922 scope.go:117] "RemoveContainer" containerID="a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99"
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.815410 4922 scope.go:117] "RemoveContainer" containerID="196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221"
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.826080 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clkkl"]
Feb 18 12:43:05 crc kubenswrapper[4922]: I0218 12:43:05.836383 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-clkkl"]
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.074302 4922 scope.go:117] "RemoveContainer" containerID="e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.321151 4922 scope.go:117] "RemoveContainer" containerID="a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99"
Feb 18 12:43:06 crc kubenswrapper[4922]: E0218 12:43:06.321786 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99\": container with ID starting with a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99 not found: ID does not exist" containerID="a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.321832 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99"} err="failed to get container status \"a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99\": rpc error: code = NotFound desc = could not find container \"a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99\": container with ID starting with a9ca6240108d56cc66c67300c6c6370746b70144120b1ea6d67266a50c71dd99 not found: ID does not exist"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.321898 4922 scope.go:117] "RemoveContainer" containerID="196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221"
Feb 18 12:43:06 crc kubenswrapper[4922]: E0218 12:43:06.322228 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221\": container with ID starting with 196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221 not found: ID does not exist" containerID="196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.322260 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221"} err="failed to get container status \"196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221\": rpc error: code = NotFound desc = could not find container \"196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221\": container with ID starting with 196ebfcf38d00312eef2d80b81e8ac6fe059cf735f5738aecf10ce0fc9c83221 not found: ID does not exist"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.322283 4922 scope.go:117] "RemoveContainer" containerID="e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663"
Feb 18 12:43:06 crc kubenswrapper[4922]: E0218 12:43:06.322569 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663\": container with ID starting with e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663 not found: ID does not exist" containerID="e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.322600 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663"} err="failed to get container status \"e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663\": rpc error: code = NotFound desc = could not find container \"e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663\": container with ID starting with e24ebb2ed1b8046bc9f6b6143fa1ec40cc72dffd5ac252d890717327c4135663 not found: ID does not exist"
Feb 18 12:43:06 crc kubenswrapper[4922]: I0218 12:43:06.984423 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63579133-a220-43c4-a314-288bc8f38929" path="/var/lib/kubelet/pods/63579133-a220-43c4-a314-288bc8f38929/volumes"
Feb 18 12:43:09 crc kubenswrapper[4922]: I0218 12:43:09.973845 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:43:09 crc kubenswrapper[4922]: E0218 12:43:09.975752 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:43:24 crc kubenswrapper[4922]: I0218 12:43:24.974032 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:43:24 crc kubenswrapper[4922]: E0218 12:43:24.974999 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:43:35 crc kubenswrapper[4922]: I0218 12:43:35.974341 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:43:35 crc kubenswrapper[4922]: E0218 12:43:35.975270 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:43:46 crc kubenswrapper[4922]: I0218 12:43:46.974725 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:43:46 crc kubenswrapper[4922]: E0218 12:43:46.975616 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:44:00 crc kubenswrapper[4922]: I0218 12:44:00.972900 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:44:00 crc kubenswrapper[4922]: E0218 12:44:00.973817 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:44:12 crc kubenswrapper[4922]: I0218 12:44:12.973681 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:44:12 crc kubenswrapper[4922]: E0218 12:44:12.975235 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:44:24 crc kubenswrapper[4922]: I0218 12:44:24.973654 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:44:24 crc kubenswrapper[4922]: E0218 12:44:24.974618 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:44:36 crc kubenswrapper[4922]: I0218 12:44:36.972952 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:44:36 crc kubenswrapper[4922]: E0218 12:44:36.973801 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:44:49 crc kubenswrapper[4922]: I0218 12:44:49.973026 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:44:49 crc kubenswrapper[4922]: E0218 12:44:49.973885 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.181419 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6"]
Feb 18 12:45:00 crc kubenswrapper[4922]: E0218 12:45:00.183015 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="registry-server"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.183036 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="registry-server"
Feb 18 12:45:00 crc kubenswrapper[4922]: E0218 12:45:00.183084 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="extract-content"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.183091 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="extract-content"
Feb 18 12:45:00 crc kubenswrapper[4922]: E0218 12:45:00.183114 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="extract-utilities"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.183123 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="extract-utilities"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.183453 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="63579133-a220-43c4-a314-288bc8f38929" containerName="registry-server"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.184538 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.187434 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.187712 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.193235 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6"]
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.314774 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69f484e4-9c27-40bd-86e6-774e5b7d6b34-secret-volume\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.315139 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69f484e4-9c27-40bd-86e6-774e5b7d6b34-config-volume\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6"
Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.315249 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7sg\" (UniqueName: \"kubernetes.io/projected/69f484e4-9c27-40bd-86e6-774e5b7d6b34-kube-api-access-gf7sg\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.416826 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7sg\" (UniqueName: \"kubernetes.io/projected/69f484e4-9c27-40bd-86e6-774e5b7d6b34-kube-api-access-gf7sg\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.417000 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69f484e4-9c27-40bd-86e6-774e5b7d6b34-secret-volume\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.417060 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69f484e4-9c27-40bd-86e6-774e5b7d6b34-config-volume\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.418624 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69f484e4-9c27-40bd-86e6-774e5b7d6b34-config-volume\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.424635 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/69f484e4-9c27-40bd-86e6-774e5b7d6b34-secret-volume\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.436178 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7sg\" (UniqueName: \"kubernetes.io/projected/69f484e4-9c27-40bd-86e6-774e5b7d6b34-kube-api-access-gf7sg\") pod \"collect-profiles-29523645-wb6x6\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:00 crc kubenswrapper[4922]: I0218 12:45:00.514484 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:01 crc kubenswrapper[4922]: I0218 12:45:01.031663 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6"] Feb 18 12:45:01 crc kubenswrapper[4922]: I0218 12:45:01.833683 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" event={"ID":"69f484e4-9c27-40bd-86e6-774e5b7d6b34","Type":"ContainerStarted","Data":"caa6534508d40a94ab11fe87cec89638773cf6fb1ccd43b4e9b6095ba98cd870"} Feb 18 12:45:01 crc kubenswrapper[4922]: I0218 12:45:01.834291 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" event={"ID":"69f484e4-9c27-40bd-86e6-774e5b7d6b34","Type":"ContainerStarted","Data":"3535708cc41f15d330adc08d8f9e5ae99755591c5c4890fa3645cc353e975da1"} Feb 18 12:45:01 crc kubenswrapper[4922]: I0218 12:45:01.852004 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" 
podStartSLOduration=1.851987152 podStartE2EDuration="1.851987152s" podCreationTimestamp="2026-02-18 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:45:01.848912924 +0000 UTC m=+4103.576617004" watchObservedRunningTime="2026-02-18 12:45:01.851987152 +0000 UTC m=+4103.579691232" Feb 18 12:45:02 crc kubenswrapper[4922]: I0218 12:45:02.846786 4922 generic.go:334] "Generic (PLEG): container finished" podID="69f484e4-9c27-40bd-86e6-774e5b7d6b34" containerID="caa6534508d40a94ab11fe87cec89638773cf6fb1ccd43b4e9b6095ba98cd870" exitCode=0 Feb 18 12:45:02 crc kubenswrapper[4922]: I0218 12:45:02.846908 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" event={"ID":"69f484e4-9c27-40bd-86e6-774e5b7d6b34","Type":"ContainerDied","Data":"caa6534508d40a94ab11fe87cec89638773cf6fb1ccd43b4e9b6095ba98cd870"} Feb 18 12:45:02 crc kubenswrapper[4922]: I0218 12:45:02.973693 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62" Feb 18 12:45:02 crc kubenswrapper[4922]: E0218 12:45:02.974630 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.269865 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.400035 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf7sg\" (UniqueName: \"kubernetes.io/projected/69f484e4-9c27-40bd-86e6-774e5b7d6b34-kube-api-access-gf7sg\") pod \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.400097 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69f484e4-9c27-40bd-86e6-774e5b7d6b34-config-volume\") pod \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.400250 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69f484e4-9c27-40bd-86e6-774e5b7d6b34-secret-volume\") pod \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\" (UID: \"69f484e4-9c27-40bd-86e6-774e5b7d6b34\") " Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.400814 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f484e4-9c27-40bd-86e6-774e5b7d6b34-config-volume" (OuterVolumeSpecName: "config-volume") pod "69f484e4-9c27-40bd-86e6-774e5b7d6b34" (UID: "69f484e4-9c27-40bd-86e6-774e5b7d6b34"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.406165 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f484e4-9c27-40bd-86e6-774e5b7d6b34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "69f484e4-9c27-40bd-86e6-774e5b7d6b34" (UID: "69f484e4-9c27-40bd-86e6-774e5b7d6b34"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.406289 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f484e4-9c27-40bd-86e6-774e5b7d6b34-kube-api-access-gf7sg" (OuterVolumeSpecName: "kube-api-access-gf7sg") pod "69f484e4-9c27-40bd-86e6-774e5b7d6b34" (UID: "69f484e4-9c27-40bd-86e6-774e5b7d6b34"). InnerVolumeSpecName "kube-api-access-gf7sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.502425 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69f484e4-9c27-40bd-86e6-774e5b7d6b34-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.502460 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf7sg\" (UniqueName: \"kubernetes.io/projected/69f484e4-9c27-40bd-86e6-774e5b7d6b34-kube-api-access-gf7sg\") on node \"crc\" DevicePath \"\"" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.502471 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69f484e4-9c27-40bd-86e6-774e5b7d6b34-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.868352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" event={"ID":"69f484e4-9c27-40bd-86e6-774e5b7d6b34","Type":"ContainerDied","Data":"3535708cc41f15d330adc08d8f9e5ae99755591c5c4890fa3645cc353e975da1"} Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.868429 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3535708cc41f15d330adc08d8f9e5ae99755591c5c4890fa3645cc353e975da1" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.868659 4922 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523645-wb6x6" Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.949099 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97"] Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.957168 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-9hg97"] Feb 18 12:45:04 crc kubenswrapper[4922]: I0218 12:45:04.986123 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2dabc9-c094-41a8-8efd-7b113f5c634c" path="/var/lib/kubelet/pods/ee2dabc9-c094-41a8-8efd-7b113f5c634c/volumes" Feb 18 12:45:13 crc kubenswrapper[4922]: I0218 12:45:13.833791 4922 scope.go:117] "RemoveContainer" containerID="8bef9aa4b92aba91322be4b15768a495bfe0d2b031bccfdb47f0999ccd8a7508" Feb 18 12:45:17 crc kubenswrapper[4922]: I0218 12:45:17.973939 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62" Feb 18 12:45:17 crc kubenswrapper[4922]: E0218 12:45:17.975955 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:45:32 crc kubenswrapper[4922]: I0218 12:45:32.973100 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62" Feb 18 12:45:32 crc kubenswrapper[4922]: E0218 12:45:32.974066 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.288080 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gsthg"] Feb 18 12:45:38 crc kubenswrapper[4922]: E0218 12:45:38.289011 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f484e4-9c27-40bd-86e6-774e5b7d6b34" containerName="collect-profiles" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.289024 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f484e4-9c27-40bd-86e6-774e5b7d6b34" containerName="collect-profiles" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.289286 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f484e4-9c27-40bd-86e6-774e5b7d6b34" containerName="collect-profiles" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.290593 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.298429 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsthg"] Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.388680 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2d2657-497c-4512-97ff-be630635c1df-catalog-content\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.388758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2d2657-497c-4512-97ff-be630635c1df-utilities\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.388904 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh589\" (UniqueName: \"kubernetes.io/projected/0c2d2657-497c-4512-97ff-be630635c1df-kube-api-access-nh589\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.490168 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh589\" (UniqueName: \"kubernetes.io/projected/0c2d2657-497c-4512-97ff-be630635c1df-kube-api-access-nh589\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.490307 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2d2657-497c-4512-97ff-be630635c1df-catalog-content\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.490349 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2d2657-497c-4512-97ff-be630635c1df-utilities\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.490873 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2d2657-497c-4512-97ff-be630635c1df-catalog-content\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.490922 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2d2657-497c-4512-97ff-be630635c1df-utilities\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.862962 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh589\" (UniqueName: \"kubernetes.io/projected/0c2d2657-497c-4512-97ff-be630635c1df-kube-api-access-nh589\") pod \"redhat-marketplace-gsthg\" (UID: \"0c2d2657-497c-4512-97ff-be630635c1df\") " pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:38 crc kubenswrapper[4922]: I0218 12:45:38.918250 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:39 crc kubenswrapper[4922]: I0218 12:45:39.574780 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsthg"] Feb 18 12:45:40 crc kubenswrapper[4922]: I0218 12:45:40.176130 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c2d2657-497c-4512-97ff-be630635c1df" containerID="d7f8d5c5fec09a56bccab7f38514244874b62e4137de81a91c7d5b473889adec" exitCode=0 Feb 18 12:45:40 crc kubenswrapper[4922]: I0218 12:45:40.176181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsthg" event={"ID":"0c2d2657-497c-4512-97ff-be630635c1df","Type":"ContainerDied","Data":"d7f8d5c5fec09a56bccab7f38514244874b62e4137de81a91c7d5b473889adec"} Feb 18 12:45:40 crc kubenswrapper[4922]: I0218 12:45:40.176601 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsthg" event={"ID":"0c2d2657-497c-4512-97ff-be630635c1df","Type":"ContainerStarted","Data":"bcd76dcd25dca3e3cfdb75b80d337acdf50960fd0a2d8689d631660c1968d8d1"} Feb 18 12:45:40 crc kubenswrapper[4922]: I0218 12:45:40.178784 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.477272 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tnmx6"] Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.479654 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.487895 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tnmx6"] Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.550847 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-utilities\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.550957 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5v8\" (UniqueName: \"kubernetes.io/projected/8c710512-6ce5-40e7-9085-70e8516bb4c2-kube-api-access-js5v8\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.550983 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-catalog-content\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.653184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-utilities\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.653279 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-js5v8\" (UniqueName: \"kubernetes.io/projected/8c710512-6ce5-40e7-9085-70e8516bb4c2-kube-api-access-js5v8\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.653312 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-catalog-content\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.653713 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-utilities\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.653979 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-catalog-content\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.671118 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5v8\" (UniqueName: \"kubernetes.io/projected/8c710512-6ce5-40e7-9085-70e8516bb4c2-kube-api-access-js5v8\") pod \"certified-operators-tnmx6\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:41 crc kubenswrapper[4922]: I0218 12:45:41.801940 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:42 crc kubenswrapper[4922]: I0218 12:45:42.366780 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tnmx6"] Feb 18 12:45:42 crc kubenswrapper[4922]: W0218 12:45:42.367183 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c710512_6ce5_40e7_9085_70e8516bb4c2.slice/crio-749290a5cf2aaed7b05cc10724ddebd9c7dc542bda1f66814d677749f6e69a95 WatchSource:0}: Error finding container 749290a5cf2aaed7b05cc10724ddebd9c7dc542bda1f66814d677749f6e69a95: Status 404 returned error can't find the container with id 749290a5cf2aaed7b05cc10724ddebd9c7dc542bda1f66814d677749f6e69a95 Feb 18 12:45:43 crc kubenswrapper[4922]: I0218 12:45:43.205242 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerID="af8772c92370f86d738fd284221d23aa9beb9592dab1e6643aaeed0fa10bef0f" exitCode=0 Feb 18 12:45:43 crc kubenswrapper[4922]: I0218 12:45:43.205317 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerDied","Data":"af8772c92370f86d738fd284221d23aa9beb9592dab1e6643aaeed0fa10bef0f"} Feb 18 12:45:43 crc kubenswrapper[4922]: I0218 12:45:43.205618 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerStarted","Data":"749290a5cf2aaed7b05cc10724ddebd9c7dc542bda1f66814d677749f6e69a95"} Feb 18 12:45:43 crc kubenswrapper[4922]: I0218 12:45:43.973731 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62" Feb 18 12:45:44 crc kubenswrapper[4922]: I0218 12:45:44.218402 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerStarted","Data":"a9eb7565239981ac85c892177039fdb2c36b50119f7e6d6a757e15e00ae601b9"} Feb 18 12:45:44 crc kubenswrapper[4922]: I0218 12:45:44.222075 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"1d138caa9f48794399f333864aee478905e60e95ed598a5f227130509c079fc0"} Feb 18 12:45:46 crc kubenswrapper[4922]: I0218 12:45:46.239869 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerID="a9eb7565239981ac85c892177039fdb2c36b50119f7e6d6a757e15e00ae601b9" exitCode=0 Feb 18 12:45:46 crc kubenswrapper[4922]: I0218 12:45:46.239961 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerDied","Data":"a9eb7565239981ac85c892177039fdb2c36b50119f7e6d6a757e15e00ae601b9"} Feb 18 12:45:49 crc kubenswrapper[4922]: I0218 12:45:49.272884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerStarted","Data":"3c1e6e3c93bfe3283d4e4bfd6869e425e9c364ada040ff5a5aaf099ba396f03c"} Feb 18 12:45:49 crc kubenswrapper[4922]: I0218 12:45:49.276463 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c2d2657-497c-4512-97ff-be630635c1df" containerID="e5a1a8b18d8a432f0c72383a517720388154e5174281804915b9ff3434226f96" exitCode=0 Feb 18 12:45:49 crc kubenswrapper[4922]: I0218 12:45:49.276508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsthg" 
event={"ID":"0c2d2657-497c-4512-97ff-be630635c1df","Type":"ContainerDied","Data":"e5a1a8b18d8a432f0c72383a517720388154e5174281804915b9ff3434226f96"} Feb 18 12:45:49 crc kubenswrapper[4922]: I0218 12:45:49.296103 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tnmx6" podStartSLOduration=3.357383961 podStartE2EDuration="8.296085445s" podCreationTimestamp="2026-02-18 12:45:41 +0000 UTC" firstStartedPulling="2026-02-18 12:45:43.207204529 +0000 UTC m=+4144.934908619" lastFinishedPulling="2026-02-18 12:45:48.145906023 +0000 UTC m=+4149.873610103" observedRunningTime="2026-02-18 12:45:49.294592177 +0000 UTC m=+4151.022296257" watchObservedRunningTime="2026-02-18 12:45:49.296085445 +0000 UTC m=+4151.023789525" Feb 18 12:45:50 crc kubenswrapper[4922]: I0218 12:45:50.286486 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsthg" event={"ID":"0c2d2657-497c-4512-97ff-be630635c1df","Type":"ContainerStarted","Data":"af2124d50d4cd1091518ef7487ec2bd809da05b2c838beb01b80578e23f35293"} Feb 18 12:45:50 crc kubenswrapper[4922]: I0218 12:45:50.306769 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gsthg" podStartSLOduration=2.704796949 podStartE2EDuration="12.306748343s" podCreationTimestamp="2026-02-18 12:45:38 +0000 UTC" firstStartedPulling="2026-02-18 12:45:40.1785515 +0000 UTC m=+4141.906255580" lastFinishedPulling="2026-02-18 12:45:49.780502894 +0000 UTC m=+4151.508206974" observedRunningTime="2026-02-18 12:45:50.303187493 +0000 UTC m=+4152.030891593" watchObservedRunningTime="2026-02-18 12:45:50.306748343 +0000 UTC m=+4152.034452423" Feb 18 12:45:51 crc kubenswrapper[4922]: I0218 12:45:51.802558 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:51 crc kubenswrapper[4922]: I0218 12:45:51.803141 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:51 crc kubenswrapper[4922]: I0218 12:45:51.851953 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:45:58 crc kubenswrapper[4922]: I0218 12:45:58.919278 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:58 crc kubenswrapper[4922]: I0218 12:45:58.919800 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:58 crc kubenswrapper[4922]: I0218 12:45:58.966427 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:59 crc kubenswrapper[4922]: I0218 12:45:59.407581 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gsthg" Feb 18 12:45:59 crc kubenswrapper[4922]: I0218 12:45:59.471197 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsthg"] Feb 18 12:45:59 crc kubenswrapper[4922]: I0218 12:45:59.513671 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bqhb"] Feb 18 12:45:59 crc kubenswrapper[4922]: I0218 12:45:59.513899 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6bqhb" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="registry-server" containerID="cri-o://3c30617879326ec9fb7f35dddf404540d84efb8a598a6b6c0ad31087cd8073f1" gracePeriod=2 Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.376350 4922 generic.go:334] "Generic (PLEG): container finished" podID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" 
containerID="3c30617879326ec9fb7f35dddf404540d84efb8a598a6b6c0ad31087cd8073f1" exitCode=0 Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.376403 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerDied","Data":"3c30617879326ec9fb7f35dddf404540d84efb8a598a6b6c0ad31087cd8073f1"} Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.682432 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bqhb" Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.722935 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q98tn\" (UniqueName: \"kubernetes.io/projected/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-kube-api-access-q98tn\") pod \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.723062 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-utilities\") pod \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.723220 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-catalog-content\") pod \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\" (UID: \"f50f60ee-09dd-45e2-aab0-384f2ff99b7d\") " Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.723829 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-utilities" (OuterVolumeSpecName: "utilities") pod "f50f60ee-09dd-45e2-aab0-384f2ff99b7d" (UID: 
"f50f60ee-09dd-45e2-aab0-384f2ff99b7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.724278 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.729695 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-kube-api-access-q98tn" (OuterVolumeSpecName: "kube-api-access-q98tn") pod "f50f60ee-09dd-45e2-aab0-384f2ff99b7d" (UID: "f50f60ee-09dd-45e2-aab0-384f2ff99b7d"). InnerVolumeSpecName "kube-api-access-q98tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.750185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f50f60ee-09dd-45e2-aab0-384f2ff99b7d" (UID: "f50f60ee-09dd-45e2-aab0-384f2ff99b7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.826191 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:46:00 crc kubenswrapper[4922]: I0218 12:46:00.826233 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q98tn\" (UniqueName: \"kubernetes.io/projected/f50f60ee-09dd-45e2-aab0-384f2ff99b7d-kube-api-access-q98tn\") on node \"crc\" DevicePath \"\"" Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.393779 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bqhb" Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.393765 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bqhb" event={"ID":"f50f60ee-09dd-45e2-aab0-384f2ff99b7d","Type":"ContainerDied","Data":"81c45efcdac9362802de58dd27bf893197de86f7ebd3df5f164f632b261368cc"} Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.393927 4922 scope.go:117] "RemoveContainer" containerID="3c30617879326ec9fb7f35dddf404540d84efb8a598a6b6c0ad31087cd8073f1" Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.418541 4922 scope.go:117] "RemoveContainer" containerID="dbaac2d048b1bf28c0a544d56094e4190ceeb213bd8b9b6ab6c58e5f9ca98a21" Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.420065 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bqhb"] Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.429975 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bqhb"] Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.445332 4922 scope.go:117] "RemoveContainer" containerID="66cf4f94781e4ece125829fc4a1a5acf7beefaa52399d35c3cf834cf5448be6c" Feb 18 12:46:01 crc kubenswrapper[4922]: I0218 12:46:01.851028 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:46:02 crc kubenswrapper[4922]: I0218 12:46:02.982987 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" path="/var/lib/kubelet/pods/f50f60ee-09dd-45e2-aab0-384f2ff99b7d/volumes" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.208856 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tnmx6"] Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.209349 4922 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-tnmx6" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="registry-server" containerID="cri-o://3c1e6e3c93bfe3283d4e4bfd6869e425e9c364ada040ff5a5aaf099ba396f03c" gracePeriod=2 Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.424319 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerID="3c1e6e3c93bfe3283d4e4bfd6869e425e9c364ada040ff5a5aaf099ba396f03c" exitCode=0 Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.424376 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerDied","Data":"3c1e6e3c93bfe3283d4e4bfd6869e425e9c364ada040ff5a5aaf099ba396f03c"} Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.724014 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.806992 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-utilities\") pod \"8c710512-6ce5-40e7-9085-70e8516bb4c2\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.807061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-catalog-content\") pod \"8c710512-6ce5-40e7-9085-70e8516bb4c2\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.807183 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js5v8\" (UniqueName: 
\"kubernetes.io/projected/8c710512-6ce5-40e7-9085-70e8516bb4c2-kube-api-access-js5v8\") pod \"8c710512-6ce5-40e7-9085-70e8516bb4c2\" (UID: \"8c710512-6ce5-40e7-9085-70e8516bb4c2\") " Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.807526 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-utilities" (OuterVolumeSpecName: "utilities") pod "8c710512-6ce5-40e7-9085-70e8516bb4c2" (UID: "8c710512-6ce5-40e7-9085-70e8516bb4c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.807859 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.814602 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c710512-6ce5-40e7-9085-70e8516bb4c2-kube-api-access-js5v8" (OuterVolumeSpecName: "kube-api-access-js5v8") pod "8c710512-6ce5-40e7-9085-70e8516bb4c2" (UID: "8c710512-6ce5-40e7-9085-70e8516bb4c2"). InnerVolumeSpecName "kube-api-access-js5v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.863398 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c710512-6ce5-40e7-9085-70e8516bb4c2" (UID: "8c710512-6ce5-40e7-9085-70e8516bb4c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.909045 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c710512-6ce5-40e7-9085-70e8516bb4c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:46:04 crc kubenswrapper[4922]: I0218 12:46:04.909074 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js5v8\" (UniqueName: \"kubernetes.io/projected/8c710512-6ce5-40e7-9085-70e8516bb4c2-kube-api-access-js5v8\") on node \"crc\" DevicePath \"\"" Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.436204 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnmx6" event={"ID":"8c710512-6ce5-40e7-9085-70e8516bb4c2","Type":"ContainerDied","Data":"749290a5cf2aaed7b05cc10724ddebd9c7dc542bda1f66814d677749f6e69a95"} Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.436560 4922 scope.go:117] "RemoveContainer" containerID="3c1e6e3c93bfe3283d4e4bfd6869e425e9c364ada040ff5a5aaf099ba396f03c" Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.436284 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnmx6" Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.456798 4922 scope.go:117] "RemoveContainer" containerID="a9eb7565239981ac85c892177039fdb2c36b50119f7e6d6a757e15e00ae601b9" Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.456799 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tnmx6"] Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.465228 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tnmx6"] Feb 18 12:46:05 crc kubenswrapper[4922]: I0218 12:46:05.476578 4922 scope.go:117] "RemoveContainer" containerID="af8772c92370f86d738fd284221d23aa9beb9592dab1e6643aaeed0fa10bef0f" Feb 18 12:46:06 crc kubenswrapper[4922]: I0218 12:46:06.990566 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" path="/var/lib/kubelet/pods/8c710512-6ce5-40e7-9085-70e8516bb4c2/volumes" Feb 18 12:48:09 crc kubenswrapper[4922]: I0218 12:48:09.807624 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:48:09 crc kubenswrapper[4922]: I0218 12:48:09.808764 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:48:39 crc kubenswrapper[4922]: I0218 12:48:39.807082 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:48:39 crc kubenswrapper[4922]: I0218 12:48:39.807680 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.520578 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bx9gn"] Feb 18 12:48:40 crc kubenswrapper[4922]: E0218 12:48:40.521436 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="extract-utilities" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521464 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="extract-utilities" Feb 18 12:48:40 crc kubenswrapper[4922]: E0218 12:48:40.521483 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="registry-server" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521491 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="registry-server" Feb 18 12:48:40 crc kubenswrapper[4922]: E0218 12:48:40.521503 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="extract-content" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521509 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="extract-content" Feb 18 12:48:40 crc kubenswrapper[4922]: E0218 12:48:40.521547 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="extract-utilities" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521555 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="extract-utilities" Feb 18 12:48:40 crc kubenswrapper[4922]: E0218 12:48:40.521564 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="extract-content" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521571 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="extract-content" Feb 18 12:48:40 crc kubenswrapper[4922]: E0218 12:48:40.521584 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="registry-server" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521592 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="registry-server" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521828 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50f60ee-09dd-45e2-aab0-384f2ff99b7d" containerName="registry-server" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.521866 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c710512-6ce5-40e7-9085-70e8516bb4c2" containerName="registry-server" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.523655 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.532766 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bx9gn"] Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.648838 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-catalog-content\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.648939 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcql8\" (UniqueName: \"kubernetes.io/projected/d2259fa1-5d48-4af4-95a8-248656995677-kube-api-access-kcql8\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.648958 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-utilities\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.751462 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcql8\" (UniqueName: \"kubernetes.io/projected/d2259fa1-5d48-4af4-95a8-248656995677-kube-api-access-kcql8\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.751507 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-utilities\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.751650 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-catalog-content\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.752175 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-catalog-content\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.752393 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-utilities\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.773549 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcql8\" (UniqueName: \"kubernetes.io/projected/d2259fa1-5d48-4af4-95a8-248656995677-kube-api-access-kcql8\") pod \"community-operators-bx9gn\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:40 crc kubenswrapper[4922]: I0218 12:48:40.851034 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:41 crc kubenswrapper[4922]: I0218 12:48:41.388192 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bx9gn"] Feb 18 12:48:41 crc kubenswrapper[4922]: I0218 12:48:41.778913 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2259fa1-5d48-4af4-95a8-248656995677" containerID="44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab" exitCode=0 Feb 18 12:48:41 crc kubenswrapper[4922]: I0218 12:48:41.778970 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerDied","Data":"44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab"} Feb 18 12:48:41 crc kubenswrapper[4922]: I0218 12:48:41.779001 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerStarted","Data":"060c2cae1327bf0e518d54b6060282df30e58a89ab4033848f08f53f8eb73e97"} Feb 18 12:48:43 crc kubenswrapper[4922]: I0218 12:48:43.798470 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerStarted","Data":"a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73"} Feb 18 12:48:44 crc kubenswrapper[4922]: I0218 12:48:44.810687 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2259fa1-5d48-4af4-95a8-248656995677" containerID="a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73" exitCode=0 Feb 18 12:48:44 crc kubenswrapper[4922]: I0218 12:48:44.810794 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" 
event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerDied","Data":"a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73"} Feb 18 12:48:45 crc kubenswrapper[4922]: I0218 12:48:45.822229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerStarted","Data":"19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062"} Feb 18 12:48:45 crc kubenswrapper[4922]: I0218 12:48:45.838670 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bx9gn" podStartSLOduration=2.405849348 podStartE2EDuration="5.838651744s" podCreationTimestamp="2026-02-18 12:48:40 +0000 UTC" firstStartedPulling="2026-02-18 12:48:41.780805327 +0000 UTC m=+4323.508509407" lastFinishedPulling="2026-02-18 12:48:45.213607723 +0000 UTC m=+4326.941311803" observedRunningTime="2026-02-18 12:48:45.83731857 +0000 UTC m=+4327.565022650" watchObservedRunningTime="2026-02-18 12:48:45.838651744 +0000 UTC m=+4327.566355824" Feb 18 12:48:50 crc kubenswrapper[4922]: I0218 12:48:50.851233 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:50 crc kubenswrapper[4922]: I0218 12:48:50.851669 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:50 crc kubenswrapper[4922]: I0218 12:48:50.901831 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:50 crc kubenswrapper[4922]: I0218 12:48:50.952719 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:51 crc kubenswrapper[4922]: I0218 12:48:51.145083 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bx9gn"] Feb 18 12:48:52 crc kubenswrapper[4922]: I0218 12:48:52.880447 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bx9gn" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="registry-server" containerID="cri-o://19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062" gracePeriod=2 Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.766843 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.891517 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2259fa1-5d48-4af4-95a8-248656995677" containerID="19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062" exitCode=0 Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.891573 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bx9gn" Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.891576 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerDied","Data":"19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062"} Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.891710 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx9gn" event={"ID":"d2259fa1-5d48-4af4-95a8-248656995677","Type":"ContainerDied","Data":"060c2cae1327bf0e518d54b6060282df30e58a89ab4033848f08f53f8eb73e97"} Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.891732 4922 scope.go:117] "RemoveContainer" containerID="19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062" Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.913097 4922 scope.go:117] "RemoveContainer" containerID="a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73" Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.930088 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-catalog-content\") pod \"d2259fa1-5d48-4af4-95a8-248656995677\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.930294 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcql8\" (UniqueName: \"kubernetes.io/projected/d2259fa1-5d48-4af4-95a8-248656995677-kube-api-access-kcql8\") pod \"d2259fa1-5d48-4af4-95a8-248656995677\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.930532 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-utilities\") pod \"d2259fa1-5d48-4af4-95a8-248656995677\" (UID: \"d2259fa1-5d48-4af4-95a8-248656995677\") " Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.931378 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-utilities" (OuterVolumeSpecName: "utilities") pod "d2259fa1-5d48-4af4-95a8-248656995677" (UID: "d2259fa1-5d48-4af4-95a8-248656995677"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.931809 4922 scope.go:117] "RemoveContainer" containerID="44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab" Feb 18 12:48:53 crc kubenswrapper[4922]: I0218 12:48:53.937199 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2259fa1-5d48-4af4-95a8-248656995677-kube-api-access-kcql8" (OuterVolumeSpecName: "kube-api-access-kcql8") pod "d2259fa1-5d48-4af4-95a8-248656995677" (UID: "d2259fa1-5d48-4af4-95a8-248656995677"). InnerVolumeSpecName "kube-api-access-kcql8". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.028123 4922 scope.go:117] "RemoveContainer" containerID="19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062"
Feb 18 12:48:54 crc kubenswrapper[4922]: E0218 12:48:54.028682 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062\": container with ID starting with 19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062 not found: ID does not exist" containerID="19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062"
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.028729 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062"} err="failed to get container status \"19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062\": rpc error: code = NotFound desc = could not find container \"19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062\": container with ID starting with 19bfb4fdec197ffadd7986d5ef7545670ca979803a6cf0429378c31163bc6062 not found: ID does not exist"
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.028754 4922 scope.go:117] "RemoveContainer" containerID="a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73"
Feb 18 12:48:54 crc kubenswrapper[4922]: E0218 12:48:54.029169 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73\": container with ID starting with a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73 not found: ID does not exist" containerID="a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73"
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.029302 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73"} err="failed to get container status \"a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73\": rpc error: code = NotFound desc = could not find container \"a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73\": container with ID starting with a916979ab662e1964d2095283461a74d2e3cde5e3b4cbc64f91e5fa317792e73 not found: ID does not exist"
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.029431 4922 scope.go:117] "RemoveContainer" containerID="44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab"
Feb 18 12:48:54 crc kubenswrapper[4922]: E0218 12:48:54.029993 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab\": container with ID starting with 44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab not found: ID does not exist" containerID="44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab"
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.030091 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab"} err="failed to get container status \"44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab\": rpc error: code = NotFound desc = could not find container \"44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab\": container with ID starting with 44b26d0885cb416ce09fd2b10a847716b4e1607e6001dc3d68e30e390801aaab not found: ID does not exist"
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.032377 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcql8\" (UniqueName: \"kubernetes.io/projected/d2259fa1-5d48-4af4-95a8-248656995677-kube-api-access-kcql8\") on node \"crc\" DevicePath \"\""
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.032401 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.264464 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2259fa1-5d48-4af4-95a8-248656995677" (UID: "d2259fa1-5d48-4af4-95a8-248656995677"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.339836 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2259fa1-5d48-4af4-95a8-248656995677-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.528471 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bx9gn"]
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.540640 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bx9gn"]
Feb 18 12:48:54 crc kubenswrapper[4922]: I0218 12:48:54.983501 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2259fa1-5d48-4af4-95a8-248656995677" path="/var/lib/kubelet/pods/d2259fa1-5d48-4af4-95a8-248656995677/volumes"
Feb 18 12:49:09 crc kubenswrapper[4922]: I0218 12:49:09.807890 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:49:09 crc kubenswrapper[4922]: I0218 12:49:09.808495 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:49:09 crc kubenswrapper[4922]: I0218 12:49:09.808549 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 12:49:09 crc kubenswrapper[4922]: I0218 12:49:09.809788 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d138caa9f48794399f333864aee478905e60e95ed598a5f227130509c079fc0"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 12:49:09 crc kubenswrapper[4922]: I0218 12:49:09.809903 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://1d138caa9f48794399f333864aee478905e60e95ed598a5f227130509c079fc0" gracePeriod=600
Feb 18 12:49:10 crc kubenswrapper[4922]: I0218 12:49:10.039440 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="1d138caa9f48794399f333864aee478905e60e95ed598a5f227130509c079fc0" exitCode=0
Feb 18 12:49:10 crc kubenswrapper[4922]: I0218 12:49:10.039580 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"1d138caa9f48794399f333864aee478905e60e95ed598a5f227130509c079fc0"}
Feb 18 12:49:10 crc kubenswrapper[4922]: I0218 12:49:10.040157 4922 scope.go:117] "RemoveContainer" containerID="da2f31195c0d03b655b84a936da99ef08c103ba9b2f411b408e8d74bdaa66a62"
Feb 18 12:49:11 crc kubenswrapper[4922]: I0218 12:49:11.051410 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"}
Feb 18 12:51:39 crc kubenswrapper[4922]: I0218 12:51:39.807771 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:51:39 crc kubenswrapper[4922]: I0218 12:51:39.808265 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:52:09 crc kubenswrapper[4922]: I0218 12:52:09.807032 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:52:09 crc kubenswrapper[4922]: I0218 12:52:09.808757 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:52:39 crc kubenswrapper[4922]: I0218 12:52:39.807984 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:52:39 crc kubenswrapper[4922]: I0218 12:52:39.808724 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:52:39 crc kubenswrapper[4922]: I0218 12:52:39.808790 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx"
Feb 18 12:52:39 crc kubenswrapper[4922]: I0218 12:52:39.809875 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 12:52:39 crc kubenswrapper[4922]: I0218 12:52:39.809944 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" gracePeriod=600
Feb 18 12:52:39 crc kubenswrapper[4922]: E0218 12:52:39.942934 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:52:40 crc kubenswrapper[4922]: I0218 12:52:40.932324 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" exitCode=0
Feb 18 12:52:40 crc kubenswrapper[4922]: I0218 12:52:40.932539 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"}
Feb 18 12:52:40 crc kubenswrapper[4922]: I0218 12:52:40.932939 4922 scope.go:117] "RemoveContainer" containerID="1d138caa9f48794399f333864aee478905e60e95ed598a5f227130509c079fc0"
Feb 18 12:52:40 crc kubenswrapper[4922]: I0218 12:52:40.934176 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:52:40 crc kubenswrapper[4922]: E0218 12:52:40.934764 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:52:54 crc kubenswrapper[4922]: I0218 12:52:54.973376 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:52:54 crc kubenswrapper[4922]: E0218 12:52:54.974202 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:53:07 crc kubenswrapper[4922]: I0218 12:53:07.973119 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:53:07 crc kubenswrapper[4922]: E0218 12:53:07.974120 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:53:21 crc kubenswrapper[4922]: I0218 12:53:21.979287 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:53:21 crc kubenswrapper[4922]: E0218 12:53:21.979977 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:53:35 crc kubenswrapper[4922]: I0218 12:53:35.972826 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:53:35 crc kubenswrapper[4922]: E0218 12:53:35.973753 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.725017 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqcnv"]
Feb 18 12:53:45 crc kubenswrapper[4922]: E0218 12:53:45.777170 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="extract-utilities"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.777210 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="extract-utilities"
Feb 18 12:53:45 crc kubenswrapper[4922]: E0218 12:53:45.777241 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="extract-content"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.777254 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="extract-content"
Feb 18 12:53:45 crc kubenswrapper[4922]: E0218 12:53:45.777265 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="registry-server"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.777273 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="registry-server"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.789434 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2259fa1-5d48-4af4-95a8-248656995677" containerName="registry-server"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.821741 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqcnv"]
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.821879 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.960648 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-catalog-content\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.960945 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-utilities\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:45 crc kubenswrapper[4922]: I0218 12:53:45.961485 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dndhx\" (UniqueName: \"kubernetes.io/projected/698ffa93-9036-42f0-9b8f-b40d8bd89799-kube-api-access-dndhx\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.063021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dndhx\" (UniqueName: \"kubernetes.io/projected/698ffa93-9036-42f0-9b8f-b40d8bd89799-kube-api-access-dndhx\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.063074 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-catalog-content\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.063143 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-utilities\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.063738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-utilities\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.064259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-catalog-content\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.082538 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dndhx\" (UniqueName: \"kubernetes.io/projected/698ffa93-9036-42f0-9b8f-b40d8bd89799-kube-api-access-dndhx\") pod \"redhat-operators-hqcnv\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") " pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.147164 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:46 crc kubenswrapper[4922]: I0218 12:53:46.624532 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqcnv"]
Feb 18 12:53:47 crc kubenswrapper[4922]: I0218 12:53:47.561811 4922 generic.go:334] "Generic (PLEG): container finished" podID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerID="1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391" exitCode=0
Feb 18 12:53:47 crc kubenswrapper[4922]: I0218 12:53:47.561892 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerDied","Data":"1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391"}
Feb 18 12:53:47 crc kubenswrapper[4922]: I0218 12:53:47.562139 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerStarted","Data":"30376a75b17d5eebb1bdba8398ea7875600a6f47c29f91d225604e0257fff402"}
Feb 18 12:53:47 crc kubenswrapper[4922]: I0218 12:53:47.566225 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 12:53:48 crc kubenswrapper[4922]: I0218 12:53:48.980830 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:53:48 crc kubenswrapper[4922]: E0218 12:53:48.981405 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:53:49 crc kubenswrapper[4922]: I0218 12:53:49.582671 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerStarted","Data":"e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b"}
Feb 18 12:53:53 crc kubenswrapper[4922]: I0218 12:53:53.618752 4922 generic.go:334] "Generic (PLEG): container finished" podID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerID="e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b" exitCode=0
Feb 18 12:53:53 crc kubenswrapper[4922]: I0218 12:53:53.618877 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerDied","Data":"e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b"}
Feb 18 12:53:54 crc kubenswrapper[4922]: I0218 12:53:54.629844 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerStarted","Data":"4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8"}
Feb 18 12:53:54 crc kubenswrapper[4922]: I0218 12:53:54.650911 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqcnv" podStartSLOduration=3.189458545 podStartE2EDuration="9.650887212s" podCreationTimestamp="2026-02-18 12:53:45 +0000 UTC" firstStartedPulling="2026-02-18 12:53:47.565926009 +0000 UTC m=+4629.293630089" lastFinishedPulling="2026-02-18 12:53:54.027354676 +0000 UTC m=+4635.755058756" observedRunningTime="2026-02-18 12:53:54.649533858 +0000 UTC m=+4636.377237938" watchObservedRunningTime="2026-02-18 12:53:54.650887212 +0000 UTC m=+4636.378591292"
Feb 18 12:53:56 crc kubenswrapper[4922]: I0218 12:53:56.147877 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:56 crc kubenswrapper[4922]: I0218 12:53:56.148190 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:53:57 crc kubenswrapper[4922]: I0218 12:53:57.193834 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqcnv" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="registry-server" probeResult="failure" output=<
Feb 18 12:53:57 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Feb 18 12:53:57 crc kubenswrapper[4922]: >
Feb 18 12:54:02 crc kubenswrapper[4922]: I0218 12:54:02.972915 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:54:02 crc kubenswrapper[4922]: E0218 12:54:02.973940 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:54:07 crc kubenswrapper[4922]: I0218 12:54:07.195151 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqcnv" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="registry-server" probeResult="failure" output=<
Feb 18 12:54:07 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Feb 18 12:54:07 crc kubenswrapper[4922]: >
Feb 18 12:54:14 crc kubenswrapper[4922]: I0218 12:54:14.972680 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:54:14 crc kubenswrapper[4922]: E0218 12:54:14.973533 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:54:16 crc kubenswrapper[4922]: I0218 12:54:16.198526 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:54:16 crc kubenswrapper[4922]: I0218 12:54:16.259050 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:54:16 crc kubenswrapper[4922]: I0218 12:54:16.925613 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqcnv"]
Feb 18 12:54:17 crc kubenswrapper[4922]: I0218 12:54:17.833568 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hqcnv" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="registry-server" containerID="cri-o://4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8" gracePeriod=2
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.318557 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.453543 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-catalog-content\") pod \"698ffa93-9036-42f0-9b8f-b40d8bd89799\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") "
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.453848 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dndhx\" (UniqueName: \"kubernetes.io/projected/698ffa93-9036-42f0-9b8f-b40d8bd89799-kube-api-access-dndhx\") pod \"698ffa93-9036-42f0-9b8f-b40d8bd89799\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") "
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.453886 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-utilities\") pod \"698ffa93-9036-42f0-9b8f-b40d8bd89799\" (UID: \"698ffa93-9036-42f0-9b8f-b40d8bd89799\") "
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.454837 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-utilities" (OuterVolumeSpecName: "utilities") pod "698ffa93-9036-42f0-9b8f-b40d8bd89799" (UID: "698ffa93-9036-42f0-9b8f-b40d8bd89799"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.461739 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698ffa93-9036-42f0-9b8f-b40d8bd89799-kube-api-access-dndhx" (OuterVolumeSpecName: "kube-api-access-dndhx") pod "698ffa93-9036-42f0-9b8f-b40d8bd89799" (UID: "698ffa93-9036-42f0-9b8f-b40d8bd89799"). InnerVolumeSpecName "kube-api-access-dndhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.557019 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dndhx\" (UniqueName: \"kubernetes.io/projected/698ffa93-9036-42f0-9b8f-b40d8bd89799-kube-api-access-dndhx\") on node \"crc\" DevicePath \"\""
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.557065 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.599242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "698ffa93-9036-42f0-9b8f-b40d8bd89799" (UID: "698ffa93-9036-42f0-9b8f-b40d8bd89799"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.660044 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698ffa93-9036-42f0-9b8f-b40d8bd89799-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.845946 4922 generic.go:334] "Generic (PLEG): container finished" podID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerID="4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8" exitCode=0
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.846015 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerDied","Data":"4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8"}
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.846097 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqcnv"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.846129 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqcnv" event={"ID":"698ffa93-9036-42f0-9b8f-b40d8bd89799","Type":"ContainerDied","Data":"30376a75b17d5eebb1bdba8398ea7875600a6f47c29f91d225604e0257fff402"}
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.846156 4922 scope.go:117] "RemoveContainer" containerID="4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.883512 4922 scope.go:117] "RemoveContainer" containerID="e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.894743 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqcnv"]
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.908083 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hqcnv"]
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.914132 4922 scope.go:117] "RemoveContainer" containerID="1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.953583 4922 scope.go:117] "RemoveContainer" containerID="4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8"
Feb 18 12:54:18 crc kubenswrapper[4922]: E0218 12:54:18.954088 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8\": container with ID starting with 4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8 not found: ID does not exist" containerID="4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.954119 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8"} err="failed to get container status \"4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8\": rpc error: code = NotFound desc = could not find container \"4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8\": container with ID starting with 4e2f3e2ccd8d9fcdf0bcd4fedb1e5412f66508d41fddbbf46882aace021eefc8 not found: ID does not exist"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.954147 4922 scope.go:117] "RemoveContainer" containerID="e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b"
Feb 18 12:54:18 crc kubenswrapper[4922]: E0218 12:54:18.954551 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b\": container with ID starting with e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b not found: ID does not exist" containerID="e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.954605 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b"} err="failed to get container status \"e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b\": rpc error: code = NotFound desc = could not find container \"e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b\": container with ID starting with e75c9b2ab15b8a47c339719f561374dd3ec0669fefe78913caecbb8e3f4e763b not found: ID does not exist"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.954638 4922 scope.go:117] "RemoveContainer" containerID="1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391"
Feb 18 12:54:18 crc kubenswrapper[4922]: E0218 12:54:18.955005 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391\": container with ID starting with 1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391 not found: ID does not exist" containerID="1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.955062 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391"} err="failed to get container status \"1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391\": rpc error: code = NotFound desc = could not find container \"1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391\": container with ID starting with 1ffb8a877d9b78b85d9fa2d690dde256e6eeb627bd355b242bda0e6e5edf9391 not found: ID does not exist"
Feb 18 12:54:18 crc kubenswrapper[4922]: I0218 12:54:18.989777 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" path="/var/lib/kubelet/pods/698ffa93-9036-42f0-9b8f-b40d8bd89799/volumes"
Feb 18 12:54:25 crc kubenswrapper[4922]: I0218 12:54:25.973595 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:54:25 crc kubenswrapper[4922]: E0218 12:54:25.974213 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:54:37 crc kubenswrapper[4922]: I0218 12:54:37.973812 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:54:37 crc kubenswrapper[4922]: E0218 12:54:37.974658 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:54:51 crc kubenswrapper[4922]: I0218 12:54:51.973642 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:54:51 crc kubenswrapper[4922]: E0218 12:54:51.974561 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:55:06 crc kubenswrapper[4922]: I0218 12:55:06.974518 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324"
Feb 18 12:55:06 crc kubenswrapper[4922]: E0218 12:55:06.975479 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 12:55:20 crc kubenswrapper[4922]: I0218
12:55:20.973319 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:55:20 crc kubenswrapper[4922]: E0218 12:55:20.974229 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:55:33 crc kubenswrapper[4922]: I0218 12:55:33.973142 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:55:33 crc kubenswrapper[4922]: E0218 12:55:33.974088 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:55:48 crc kubenswrapper[4922]: I0218 12:55:48.982048 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:55:48 crc kubenswrapper[4922]: E0218 12:55:48.984587 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:56:03 crc 
kubenswrapper[4922]: I0218 12:56:03.974664 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:56:03 crc kubenswrapper[4922]: E0218 12:56:03.975983 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:56:17 crc kubenswrapper[4922]: I0218 12:56:17.973345 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:56:17 crc kubenswrapper[4922]: E0218 12:56:17.974276 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:56:30 crc kubenswrapper[4922]: I0218 12:56:30.973497 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:56:30 crc kubenswrapper[4922]: E0218 12:56:30.974699 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 
18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.300809 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrs5h"] Feb 18 12:56:33 crc kubenswrapper[4922]: E0218 12:56:33.301678 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="extract-utilities" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.301694 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="extract-utilities" Feb 18 12:56:33 crc kubenswrapper[4922]: E0218 12:56:33.301711 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="extract-content" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.301718 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="extract-content" Feb 18 12:56:33 crc kubenswrapper[4922]: E0218 12:56:33.301736 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="registry-server" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.301745 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="registry-server" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.301993 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="698ffa93-9036-42f0-9b8f-b40d8bd89799" containerName="registry-server" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.303722 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.322110 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrs5h"] Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.454032 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-utilities\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.454182 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-catalog-content\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.454311 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ms4t\" (UniqueName: \"kubernetes.io/projected/94c125c3-32cd-4f33-b450-40c8cc7eacae-kube-api-access-9ms4t\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.556637 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ms4t\" (UniqueName: \"kubernetes.io/projected/94c125c3-32cd-4f33-b450-40c8cc7eacae-kube-api-access-9ms4t\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.557558 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-utilities\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.557822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-catalog-content\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.558532 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-catalog-content\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.558650 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-utilities\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.578933 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ms4t\" (UniqueName: \"kubernetes.io/projected/94c125c3-32cd-4f33-b450-40c8cc7eacae-kube-api-access-9ms4t\") pod \"certified-operators-qrs5h\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:33 crc kubenswrapper[4922]: I0218 12:56:33.630985 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:34 crc kubenswrapper[4922]: I0218 12:56:34.242604 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrs5h"] Feb 18 12:56:35 crc kubenswrapper[4922]: I0218 12:56:35.158257 4922 generic.go:334] "Generic (PLEG): container finished" podID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerID="08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632" exitCode=0 Feb 18 12:56:35 crc kubenswrapper[4922]: I0218 12:56:35.158463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerDied","Data":"08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632"} Feb 18 12:56:35 crc kubenswrapper[4922]: I0218 12:56:35.158633 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerStarted","Data":"7c62826b0f0334c8b0aa3d37b0faacd0446320f8a887d70e52e623149d73d432"} Feb 18 12:56:36 crc kubenswrapper[4922]: I0218 12:56:36.172220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerStarted","Data":"b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9"} Feb 18 12:56:38 crc kubenswrapper[4922]: I0218 12:56:38.194937 4922 generic.go:334] "Generic (PLEG): container finished" podID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerID="b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9" exitCode=0 Feb 18 12:56:38 crc kubenswrapper[4922]: I0218 12:56:38.194976 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" 
event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerDied","Data":"b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9"} Feb 18 12:56:40 crc kubenswrapper[4922]: I0218 12:56:40.218754 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerStarted","Data":"66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b"} Feb 18 12:56:40 crc kubenswrapper[4922]: I0218 12:56:40.237083 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrs5h" podStartSLOduration=2.796039842 podStartE2EDuration="7.237064262s" podCreationTimestamp="2026-02-18 12:56:33 +0000 UTC" firstStartedPulling="2026-02-18 12:56:35.161472215 +0000 UTC m=+4796.889176295" lastFinishedPulling="2026-02-18 12:56:39.602496635 +0000 UTC m=+4801.330200715" observedRunningTime="2026-02-18 12:56:40.235212265 +0000 UTC m=+4801.962916355" watchObservedRunningTime="2026-02-18 12:56:40.237064262 +0000 UTC m=+4801.964768362" Feb 18 12:56:43 crc kubenswrapper[4922]: I0218 12:56:43.631869 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:43 crc kubenswrapper[4922]: I0218 12:56:43.632389 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:43 crc kubenswrapper[4922]: I0218 12:56:43.680885 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:44 crc kubenswrapper[4922]: I0218 12:56:44.298723 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:44 crc kubenswrapper[4922]: I0218 12:56:44.351612 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-qrs5h"] Feb 18 12:56:45 crc kubenswrapper[4922]: I0218 12:56:45.974124 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:56:45 crc kubenswrapper[4922]: E0218 12:56:45.974795 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.265702 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrs5h" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="registry-server" containerID="cri-o://66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b" gracePeriod=2 Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.777520 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.955937 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ms4t\" (UniqueName: \"kubernetes.io/projected/94c125c3-32cd-4f33-b450-40c8cc7eacae-kube-api-access-9ms4t\") pod \"94c125c3-32cd-4f33-b450-40c8cc7eacae\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.956248 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-utilities\") pod \"94c125c3-32cd-4f33-b450-40c8cc7eacae\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.956346 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-catalog-content\") pod \"94c125c3-32cd-4f33-b450-40c8cc7eacae\" (UID: \"94c125c3-32cd-4f33-b450-40c8cc7eacae\") " Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.962546 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c125c3-32cd-4f33-b450-40c8cc7eacae-kube-api-access-9ms4t" (OuterVolumeSpecName: "kube-api-access-9ms4t") pod "94c125c3-32cd-4f33-b450-40c8cc7eacae" (UID: "94c125c3-32cd-4f33-b450-40c8cc7eacae"). InnerVolumeSpecName "kube-api-access-9ms4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:56:46 crc kubenswrapper[4922]: I0218 12:56:46.964511 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-utilities" (OuterVolumeSpecName: "utilities") pod "94c125c3-32cd-4f33-b450-40c8cc7eacae" (UID: "94c125c3-32cd-4f33-b450-40c8cc7eacae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.030869 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94c125c3-32cd-4f33-b450-40c8cc7eacae" (UID: "94c125c3-32cd-4f33-b450-40c8cc7eacae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.059306 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ms4t\" (UniqueName: \"kubernetes.io/projected/94c125c3-32cd-4f33-b450-40c8cc7eacae-kube-api-access-9ms4t\") on node \"crc\" DevicePath \"\"" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.059350 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.059380 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c125c3-32cd-4f33-b450-40c8cc7eacae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.275790 4922 generic.go:334] "Generic (PLEG): container finished" podID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerID="66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b" exitCode=0 Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.275888 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrs5h" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.275885 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerDied","Data":"66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b"} Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.276380 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrs5h" event={"ID":"94c125c3-32cd-4f33-b450-40c8cc7eacae","Type":"ContainerDied","Data":"7c62826b0f0334c8b0aa3d37b0faacd0446320f8a887d70e52e623149d73d432"} Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.276401 4922 scope.go:117] "RemoveContainer" containerID="66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.310355 4922 scope.go:117] "RemoveContainer" containerID="b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.317633 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrs5h"] Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.324785 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrs5h"] Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.347686 4922 scope.go:117] "RemoveContainer" containerID="08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.381296 4922 scope.go:117] "RemoveContainer" containerID="66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b" Feb 18 12:56:47 crc kubenswrapper[4922]: E0218 12:56:47.381779 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b\": container with ID starting with 66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b not found: ID does not exist" containerID="66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.381816 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b"} err="failed to get container status \"66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b\": rpc error: code = NotFound desc = could not find container \"66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b\": container with ID starting with 66d77ef75776ebe5a22cef4805ed16aa4fb6c96fc71e508d6f998ab7ef682a5b not found: ID does not exist" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.381845 4922 scope.go:117] "RemoveContainer" containerID="b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9" Feb 18 12:56:47 crc kubenswrapper[4922]: E0218 12:56:47.382123 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9\": container with ID starting with b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9 not found: ID does not exist" containerID="b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.382147 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9"} err="failed to get container status \"b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9\": rpc error: code = NotFound desc = could not find container \"b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9\": container with ID 
starting with b7f8c4fb90fcb2523ccad8afad30e2ec491363e6f807578a7eba0ee12f9fbee9 not found: ID does not exist" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.382166 4922 scope.go:117] "RemoveContainer" containerID="08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632" Feb 18 12:56:47 crc kubenswrapper[4922]: E0218 12:56:47.382491 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632\": container with ID starting with 08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632 not found: ID does not exist" containerID="08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632" Feb 18 12:56:47 crc kubenswrapper[4922]: I0218 12:56:47.382517 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632"} err="failed to get container status \"08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632\": rpc error: code = NotFound desc = could not find container \"08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632\": container with ID starting with 08a0375b93f0bbc164f903238995d36229517460a9b5e99b1dcfb85d4f3d5632 not found: ID does not exist" Feb 18 12:56:48 crc kubenswrapper[4922]: I0218 12:56:48.985877 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" path="/var/lib/kubelet/pods/94c125c3-32cd-4f33-b450-40c8cc7eacae/volumes" Feb 18 12:56:56 crc kubenswrapper[4922]: I0218 12:56:56.974729 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:56:56 crc kubenswrapper[4922]: E0218 12:56:56.975721 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:57:10 crc kubenswrapper[4922]: I0218 12:57:10.974294 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:57:10 crc kubenswrapper[4922]: E0218 12:57:10.974982 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:57:22 crc kubenswrapper[4922]: I0218 12:57:22.975077 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:57:22 crc kubenswrapper[4922]: E0218 12:57:22.976092 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:57:34 crc kubenswrapper[4922]: I0218 12:57:34.973184 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:57:34 crc kubenswrapper[4922]: E0218 12:57:34.973978 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 12:57:45 crc kubenswrapper[4922]: I0218 12:57:45.973348 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 12:57:46 crc kubenswrapper[4922]: I0218 12:57:46.811512 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"a3d670811b88086243e0421af6023cabe2c743133e3bc783d1903c4468a2a91f"} Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.815766 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hvxm5"] Feb 18 12:57:51 crc kubenswrapper[4922]: E0218 12:57:51.816688 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="extract-content" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.816704 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="extract-content" Feb 18 12:57:51 crc kubenswrapper[4922]: E0218 12:57:51.816779 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="extract-utilities" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.816788 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="extract-utilities" Feb 18 12:57:51 crc kubenswrapper[4922]: E0218 12:57:51.816801 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="registry-server" Feb 18 12:57:51 crc 
kubenswrapper[4922]: I0218 12:57:51.816807 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="registry-server" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.817003 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c125c3-32cd-4f33-b450-40c8cc7eacae" containerName="registry-server" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.818572 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.833207 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvxm5"] Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.917762 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-catalog-content\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.917904 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-utilities\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:51 crc kubenswrapper[4922]: I0218 12:57:51.917977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f557n\" (UniqueName: \"kubernetes.io/projected/3c3f144e-fabf-4034-975e-46f1494ee4bf-kube-api-access-f557n\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 
crc kubenswrapper[4922]: I0218 12:57:52.019606 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-utilities\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.019682 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f557n\" (UniqueName: \"kubernetes.io/projected/3c3f144e-fabf-4034-975e-46f1494ee4bf-kube-api-access-f557n\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.019821 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-catalog-content\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.020338 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-catalog-content\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.020351 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-utilities\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.039509 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f557n\" (UniqueName: \"kubernetes.io/projected/3c3f144e-fabf-4034-975e-46f1494ee4bf-kube-api-access-f557n\") pod \"redhat-marketplace-hvxm5\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.147791 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.594243 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvxm5"] Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.874623 4922 generic.go:334] "Generic (PLEG): container finished" podID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerID="882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3" exitCode=0 Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.874693 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerDied","Data":"882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3"} Feb 18 12:57:52 crc kubenswrapper[4922]: I0218 12:57:52.874992 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerStarted","Data":"0d0fed85c22369371f11a697ae35c11710d53cc735f858229a22ee2477d97913"} Feb 18 12:57:53 crc kubenswrapper[4922]: I0218 12:57:53.885097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerStarted","Data":"9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7"} Feb 18 12:57:54 crc kubenswrapper[4922]: I0218 12:57:54.896109 4922 
generic.go:334] "Generic (PLEG): container finished" podID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerID="9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7" exitCode=0 Feb 18 12:57:54 crc kubenswrapper[4922]: I0218 12:57:54.896174 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerDied","Data":"9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7"} Feb 18 12:57:55 crc kubenswrapper[4922]: I0218 12:57:55.907277 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerStarted","Data":"737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368"} Feb 18 12:57:55 crc kubenswrapper[4922]: I0218 12:57:55.935761 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hvxm5" podStartSLOduration=2.551131387 podStartE2EDuration="4.93574119s" podCreationTimestamp="2026-02-18 12:57:51 +0000 UTC" firstStartedPulling="2026-02-18 12:57:52.875959142 +0000 UTC m=+4874.603663222" lastFinishedPulling="2026-02-18 12:57:55.260568945 +0000 UTC m=+4876.988273025" observedRunningTime="2026-02-18 12:57:55.926621129 +0000 UTC m=+4877.654325209" watchObservedRunningTime="2026-02-18 12:57:55.93574119 +0000 UTC m=+4877.663445270" Feb 18 12:58:02 crc kubenswrapper[4922]: I0218 12:58:02.149007 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:58:02 crc kubenswrapper[4922]: I0218 12:58:02.150115 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:58:02 crc kubenswrapper[4922]: I0218 12:58:02.199181 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:58:03 crc kubenswrapper[4922]: I0218 12:58:03.036155 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:58:04 crc kubenswrapper[4922]: I0218 12:58:04.206112 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvxm5"] Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.006149 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hvxm5" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="registry-server" containerID="cri-o://737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368" gracePeriod=2 Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.502912 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.596135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-utilities\") pod \"3c3f144e-fabf-4034-975e-46f1494ee4bf\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.596253 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f557n\" (UniqueName: \"kubernetes.io/projected/3c3f144e-fabf-4034-975e-46f1494ee4bf-kube-api-access-f557n\") pod \"3c3f144e-fabf-4034-975e-46f1494ee4bf\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.596379 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-catalog-content\") pod 
\"3c3f144e-fabf-4034-975e-46f1494ee4bf\" (UID: \"3c3f144e-fabf-4034-975e-46f1494ee4bf\") " Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.597516 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-utilities" (OuterVolumeSpecName: "utilities") pod "3c3f144e-fabf-4034-975e-46f1494ee4bf" (UID: "3c3f144e-fabf-4034-975e-46f1494ee4bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.605491 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3f144e-fabf-4034-975e-46f1494ee4bf-kube-api-access-f557n" (OuterVolumeSpecName: "kube-api-access-f557n") pod "3c3f144e-fabf-4034-975e-46f1494ee4bf" (UID: "3c3f144e-fabf-4034-975e-46f1494ee4bf"). InnerVolumeSpecName "kube-api-access-f557n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.621263 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c3f144e-fabf-4034-975e-46f1494ee4bf" (UID: "3c3f144e-fabf-4034-975e-46f1494ee4bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.699064 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.699106 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f557n\" (UniqueName: \"kubernetes.io/projected/3c3f144e-fabf-4034-975e-46f1494ee4bf-kube-api-access-f557n\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:05 crc kubenswrapper[4922]: I0218 12:58:05.699114 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3f144e-fabf-4034-975e-46f1494ee4bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.019986 4922 generic.go:334] "Generic (PLEG): container finished" podID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerID="737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368" exitCode=0 Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.020056 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerDied","Data":"737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368"} Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.020085 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvxm5" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.020102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvxm5" event={"ID":"3c3f144e-fabf-4034-975e-46f1494ee4bf","Type":"ContainerDied","Data":"0d0fed85c22369371f11a697ae35c11710d53cc735f858229a22ee2477d97913"} Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.020121 4922 scope.go:117] "RemoveContainer" containerID="737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.061421 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvxm5"] Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.062487 4922 scope.go:117] "RemoveContainer" containerID="9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.072906 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvxm5"] Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.372625 4922 scope.go:117] "RemoveContainer" containerID="882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.429353 4922 scope.go:117] "RemoveContainer" containerID="737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368" Feb 18 12:58:06 crc kubenswrapper[4922]: E0218 12:58:06.429841 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368\": container with ID starting with 737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368 not found: ID does not exist" containerID="737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.429959 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368"} err="failed to get container status \"737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368\": rpc error: code = NotFound desc = could not find container \"737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368\": container with ID starting with 737a38605f39c3e1061fe189a1db07f3846ad31c74ef2249f1a43f7e2f414368 not found: ID does not exist" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.430071 4922 scope.go:117] "RemoveContainer" containerID="9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7" Feb 18 12:58:06 crc kubenswrapper[4922]: E0218 12:58:06.430833 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7\": container with ID starting with 9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7 not found: ID does not exist" containerID="9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.430924 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7"} err="failed to get container status \"9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7\": rpc error: code = NotFound desc = could not find container \"9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7\": container with ID starting with 9865bb8d3add17f03ed0cb12b7bff7d34d2081926faa532b4686b36cd82edfb7 not found: ID does not exist" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.430999 4922 scope.go:117] "RemoveContainer" containerID="882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3" Feb 18 12:58:06 crc kubenswrapper[4922]: E0218 
12:58:06.431402 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3\": container with ID starting with 882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3 not found: ID does not exist" containerID="882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.431435 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3"} err="failed to get container status \"882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3\": rpc error: code = NotFound desc = could not find container \"882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3\": container with ID starting with 882d0494ca3cd1b991318e5181f3fda34cfb362f6738b6b9c57fca6cd37790d3 not found: ID does not exist" Feb 18 12:58:06 crc kubenswrapper[4922]: I0218 12:58:06.990571 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" path="/var/lib/kubelet/pods/3c3f144e-fabf-4034-975e-46f1494ee4bf/volumes" Feb 18 12:58:19 crc kubenswrapper[4922]: I0218 12:58:19.137215 4922 generic.go:334] "Generic (PLEG): container finished" podID="4525818f-9e1d-48a0-8ec1-1a22a0841dd4" containerID="e3b2b8b928d4d252bf46e4bb853a742c089adf478f27339f292b3bd6347dcdc0" exitCode=1 Feb 18 12:58:19 crc kubenswrapper[4922]: I0218 12:58:19.137324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4525818f-9e1d-48a0-8ec1-1a22a0841dd4","Type":"ContainerDied","Data":"e3b2b8b928d4d252bf46e4bb853a742c089adf478f27339f292b3bd6347dcdc0"} Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.485656 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.583744 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-temporary\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.583830 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-workdir\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.583862 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.583898 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ssh-key\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.583978 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-config-data\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.584051 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config-secret\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.584424 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.584693 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-config-data" (OuterVolumeSpecName: "config-data") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.584781 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ca-certs\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.584982 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqctm\" (UniqueName: \"kubernetes.io/projected/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-kube-api-access-vqctm\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.585048 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\" (UID: \"4525818f-9e1d-48a0-8ec1-1a22a0841dd4\") " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.585749 4922 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.585768 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.589711 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). 
InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.590188 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-kube-api-access-vqctm" (OuterVolumeSpecName: "kube-api-access-vqctm") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "kube-api-access-vqctm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.611320 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.612332 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.613979 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.641344 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.662425 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4525818f-9e1d-48a0-8ec1-1a22a0841dd4" (UID: "4525818f-9e1d-48a0-8ec1-1a22a0841dd4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687078 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687110 4922 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687120 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqctm\" (UniqueName: \"kubernetes.io/projected/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-kube-api-access-vqctm\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687151 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687162 4922 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687172 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.687180 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4525818f-9e1d-48a0-8ec1-1a22a0841dd4-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.708550 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 18 12:58:20 crc kubenswrapper[4922]: I0218 12:58:20.789230 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 18 12:58:21 crc kubenswrapper[4922]: I0218 12:58:21.157209 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4525818f-9e1d-48a0-8ec1-1a22a0841dd4","Type":"ContainerDied","Data":"492694f74401bc119697f1caa4fa178df1922c217659e262bc75d36660dd58d8"} Feb 18 12:58:21 crc kubenswrapper[4922]: I0218 12:58:21.157262 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="492694f74401bc119697f1caa4fa178df1922c217659e262bc75d36660dd58d8" Feb 18 12:58:21 crc kubenswrapper[4922]: I0218 12:58:21.157278 4922 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.224905 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 12:58:28 crc kubenswrapper[4922]: E0218 12:58:28.226590 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="extract-utilities" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.226611 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="extract-utilities" Feb 18 12:58:28 crc kubenswrapper[4922]: E0218 12:58:28.226631 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="extract-content" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.226639 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="extract-content" Feb 18 12:58:28 crc kubenswrapper[4922]: E0218 12:58:28.226655 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4525818f-9e1d-48a0-8ec1-1a22a0841dd4" containerName="tempest-tests-tempest-tests-runner" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.226662 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4525818f-9e1d-48a0-8ec1-1a22a0841dd4" containerName="tempest-tests-tempest-tests-runner" Feb 18 12:58:28 crc kubenswrapper[4922]: E0218 12:58:28.226688 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="registry-server" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.226694 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="registry-server" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.226964 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4525818f-9e1d-48a0-8ec1-1a22a0841dd4" containerName="tempest-tests-tempest-tests-runner" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.227016 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3f144e-fabf-4034-975e-46f1494ee4bf" containerName="registry-server" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.228049 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.231073 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-thfnr" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.244397 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.340728 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkds\" (UniqueName: \"kubernetes.io/projected/fc2833dd-ab51-414c-9ce3-ed8078989ea5-kube-api-access-stkds\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.341042 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.443401 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkds\" (UniqueName: 
\"kubernetes.io/projected/fc2833dd-ab51-414c-9ce3-ed8078989ea5-kube-api-access-stkds\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.443557 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.444028 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.465259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkds\" (UniqueName: \"kubernetes.io/projected/fc2833dd-ab51-414c-9ce3-ed8078989ea5-kube-api-access-stkds\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 crc kubenswrapper[4922]: I0218 12:58:28.471042 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fc2833dd-ab51-414c-9ce3-ed8078989ea5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:28 
crc kubenswrapper[4922]: I0218 12:58:28.551993 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 12:58:29 crc kubenswrapper[4922]: I0218 12:58:29.033052 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 12:58:29 crc kubenswrapper[4922]: I0218 12:58:29.220803 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fc2833dd-ab51-414c-9ce3-ed8078989ea5","Type":"ContainerStarted","Data":"5d8033e919b26ce15b779e896791d985ec6fd7ba74dfed5e364ccbafba85687d"} Feb 18 12:58:30 crc kubenswrapper[4922]: I0218 12:58:30.229836 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fc2833dd-ab51-414c-9ce3-ed8078989ea5","Type":"ContainerStarted","Data":"58ee746ddfe99d427062a01a14a31e40a16f287ebdc8eab91fc41fab8d27f975"} Feb 18 12:58:30 crc kubenswrapper[4922]: I0218 12:58:30.244774 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.333675906 podStartE2EDuration="2.244756278s" podCreationTimestamp="2026-02-18 12:58:28 +0000 UTC" firstStartedPulling="2026-02-18 12:58:29.024440872 +0000 UTC m=+4910.752144952" lastFinishedPulling="2026-02-18 12:58:29.935521244 +0000 UTC m=+4911.663225324" observedRunningTime="2026-02-18 12:58:30.240886539 +0000 UTC m=+4911.968590619" watchObservedRunningTime="2026-02-18 12:58:30.244756278 +0000 UTC m=+4911.972460358" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.591783 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nxktd"] Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.595535 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.603265 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxktd"] Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.727057 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttfnt\" (UniqueName: \"kubernetes.io/projected/37694696-8e79-4f78-be48-8d4bbdeef478-kube-api-access-ttfnt\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.727185 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-utilities\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.727211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-catalog-content\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.829415 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttfnt\" (UniqueName: \"kubernetes.io/projected/37694696-8e79-4f78-be48-8d4bbdeef478-kube-api-access-ttfnt\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.829580 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-utilities\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.829619 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-catalog-content\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.830169 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-catalog-content\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.830479 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-utilities\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.849749 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttfnt\" (UniqueName: \"kubernetes.io/projected/37694696-8e79-4f78-be48-8d4bbdeef478-kube-api-access-ttfnt\") pod \"community-operators-nxktd\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:53 crc kubenswrapper[4922]: I0218 12:58:53.924643 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:58:54 crc kubenswrapper[4922]: W0218 12:58:54.426446 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37694696_8e79_4f78_be48_8d4bbdeef478.slice/crio-60d43046c25b17aef620e372e4fff97acd4e4d4aef2c1b4922e6e429cb743ec5 WatchSource:0}: Error finding container 60d43046c25b17aef620e372e4fff97acd4e4d4aef2c1b4922e6e429cb743ec5: Status 404 returned error can't find the container with id 60d43046c25b17aef620e372e4fff97acd4e4d4aef2c1b4922e6e429cb743ec5 Feb 18 12:58:54 crc kubenswrapper[4922]: I0218 12:58:54.427903 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxktd"] Feb 18 12:58:54 crc kubenswrapper[4922]: I0218 12:58:54.438081 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerStarted","Data":"60d43046c25b17aef620e372e4fff97acd4e4d4aef2c1b4922e6e429cb743ec5"} Feb 18 12:58:55 crc kubenswrapper[4922]: I0218 12:58:55.447961 4922 generic.go:334] "Generic (PLEG): container finished" podID="37694696-8e79-4f78-be48-8d4bbdeef478" containerID="0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d" exitCode=0 Feb 18 12:58:55 crc kubenswrapper[4922]: I0218 12:58:55.448285 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerDied","Data":"0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d"} Feb 18 12:58:55 crc kubenswrapper[4922]: I0218 12:58:55.451008 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:58:56 crc kubenswrapper[4922]: I0218 12:58:56.458860 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerStarted","Data":"8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349"} Feb 18 12:58:57 crc kubenswrapper[4922]: I0218 12:58:57.469727 4922 generic.go:334] "Generic (PLEG): container finished" podID="37694696-8e79-4f78-be48-8d4bbdeef478" containerID="8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349" exitCode=0 Feb 18 12:58:57 crc kubenswrapper[4922]: I0218 12:58:57.469818 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerDied","Data":"8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349"} Feb 18 12:58:58 crc kubenswrapper[4922]: I0218 12:58:58.481643 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerStarted","Data":"c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9"} Feb 18 12:58:58 crc kubenswrapper[4922]: I0218 12:58:58.501819 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nxktd" podStartSLOduration=3.098737372 podStartE2EDuration="5.501800002s" podCreationTimestamp="2026-02-18 12:58:53 +0000 UTC" firstStartedPulling="2026-02-18 12:58:55.450797196 +0000 UTC m=+4937.178501276" lastFinishedPulling="2026-02-18 12:58:57.853859826 +0000 UTC m=+4939.581563906" observedRunningTime="2026-02-18 12:58:58.495959114 +0000 UTC m=+4940.223663194" watchObservedRunningTime="2026-02-18 12:58:58.501800002 +0000 UTC m=+4940.229504082" Feb 18 12:59:03 crc kubenswrapper[4922]: I0218 12:59:03.925340 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:59:03 crc kubenswrapper[4922]: I0218 12:59:03.925930 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:59:03 crc kubenswrapper[4922]: I0218 12:59:03.971347 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:59:04 crc kubenswrapper[4922]: I0218 12:59:04.575165 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:59:04 crc kubenswrapper[4922]: I0218 12:59:04.621621 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxktd"] Feb 18 12:59:06 crc kubenswrapper[4922]: I0218 12:59:06.546641 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nxktd" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="registry-server" containerID="cri-o://c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9" gracePeriod=2 Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.001239 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bf65p/must-gather-pnxz8"] Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.003534 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bf65p/must-gather-pnxz8"] Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.003622 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.015805 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bf65p"/"kube-root-ca.crt" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.016051 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bf65p"/"openshift-service-ca.crt" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.018332 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bf65p"/"default-dockercfg-xvq9t" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.119208 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.124808 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-must-gather-output\") pod \"must-gather-pnxz8\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.124877 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkczn\" (UniqueName: \"kubernetes.io/projected/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-kube-api-access-hkczn\") pod \"must-gather-pnxz8\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.226700 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttfnt\" (UniqueName: \"kubernetes.io/projected/37694696-8e79-4f78-be48-8d4bbdeef478-kube-api-access-ttfnt\") pod \"37694696-8e79-4f78-be48-8d4bbdeef478\" (UID: 
\"37694696-8e79-4f78-be48-8d4bbdeef478\") " Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.226961 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-catalog-content\") pod \"37694696-8e79-4f78-be48-8d4bbdeef478\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.227023 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-utilities\") pod \"37694696-8e79-4f78-be48-8d4bbdeef478\" (UID: \"37694696-8e79-4f78-be48-8d4bbdeef478\") " Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.227458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-must-gather-output\") pod \"must-gather-pnxz8\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.227518 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkczn\" (UniqueName: \"kubernetes.io/projected/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-kube-api-access-hkczn\") pod \"must-gather-pnxz8\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.228045 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-utilities" (OuterVolumeSpecName: "utilities") pod "37694696-8e79-4f78-be48-8d4bbdeef478" (UID: "37694696-8e79-4f78-be48-8d4bbdeef478"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.228353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-must-gather-output\") pod \"must-gather-pnxz8\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.234516 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37694696-8e79-4f78-be48-8d4bbdeef478-kube-api-access-ttfnt" (OuterVolumeSpecName: "kube-api-access-ttfnt") pod "37694696-8e79-4f78-be48-8d4bbdeef478" (UID: "37694696-8e79-4f78-be48-8d4bbdeef478"). InnerVolumeSpecName "kube-api-access-ttfnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.244186 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkczn\" (UniqueName: \"kubernetes.io/projected/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-kube-api-access-hkczn\") pod \"must-gather-pnxz8\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.284946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37694696-8e79-4f78-be48-8d4bbdeef478" (UID: "37694696-8e79-4f78-be48-8d4bbdeef478"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.329972 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.330005 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttfnt\" (UniqueName: \"kubernetes.io/projected/37694696-8e79-4f78-be48-8d4bbdeef478-kube-api-access-ttfnt\") on node \"crc\" DevicePath \"\"" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.330016 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37694696-8e79-4f78-be48-8d4bbdeef478-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.414866 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.558333 4922 generic.go:334] "Generic (PLEG): container finished" podID="37694696-8e79-4f78-be48-8d4bbdeef478" containerID="c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9" exitCode=0 Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.558408 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerDied","Data":"c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9"} Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.558438 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxktd" event={"ID":"37694696-8e79-4f78-be48-8d4bbdeef478","Type":"ContainerDied","Data":"60d43046c25b17aef620e372e4fff97acd4e4d4aef2c1b4922e6e429cb743ec5"} Feb 18 12:59:07 crc kubenswrapper[4922]: 
I0218 12:59:07.558472 4922 scope.go:117] "RemoveContainer" containerID="c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.558648 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxktd" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.597966 4922 scope.go:117] "RemoveContainer" containerID="8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.609751 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxktd"] Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.618006 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nxktd"] Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.632014 4922 scope.go:117] "RemoveContainer" containerID="0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.667781 4922 scope.go:117] "RemoveContainer" containerID="c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9" Feb 18 12:59:07 crc kubenswrapper[4922]: E0218 12:59:07.668539 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9\": container with ID starting with c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9 not found: ID does not exist" containerID="c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.668567 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9"} err="failed to get container status 
\"c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9\": rpc error: code = NotFound desc = could not find container \"c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9\": container with ID starting with c5daea675634727bd83c74c0073578c4223e10a3aaa230ae869aeb8c315362d9 not found: ID does not exist" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.668589 4922 scope.go:117] "RemoveContainer" containerID="8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349" Feb 18 12:59:07 crc kubenswrapper[4922]: E0218 12:59:07.669110 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349\": container with ID starting with 8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349 not found: ID does not exist" containerID="8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.669134 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349"} err="failed to get container status \"8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349\": rpc error: code = NotFound desc = could not find container \"8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349\": container with ID starting with 8efcea36a093ebc966af2378eeaab555a57033dc35aa8325ecff1d49f8467349 not found: ID does not exist" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.669206 4922 scope.go:117] "RemoveContainer" containerID="0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d" Feb 18 12:59:07 crc kubenswrapper[4922]: E0218 12:59:07.669602 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d\": container with ID starting with 0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d not found: ID does not exist" containerID="0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.669620 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d"} err="failed to get container status \"0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d\": rpc error: code = NotFound desc = could not find container \"0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d\": container with ID starting with 0c8448c56970a5721edade6dfb6a144edc5f360fa40fa69b6e85fe752f611c9d not found: ID does not exist" Feb 18 12:59:07 crc kubenswrapper[4922]: I0218 12:59:07.874761 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bf65p/must-gather-pnxz8"] Feb 18 12:59:08 crc kubenswrapper[4922]: I0218 12:59:08.567676 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/must-gather-pnxz8" event={"ID":"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44","Type":"ContainerStarted","Data":"44ad0ca2b1f4d58d5e9c1bdc531a19d9bcb3626c9f2ad8ca2e2348f7eea332c7"} Feb 18 12:59:08 crc kubenswrapper[4922]: I0218 12:59:08.985453 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" path="/var/lib/kubelet/pods/37694696-8e79-4f78-be48-8d4bbdeef478/volumes" Feb 18 12:59:16 crc kubenswrapper[4922]: I0218 12:59:16.639950 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/must-gather-pnxz8" event={"ID":"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44","Type":"ContainerStarted","Data":"87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0"} Feb 18 12:59:16 crc kubenswrapper[4922]: I0218 12:59:16.640532 
4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/must-gather-pnxz8" event={"ID":"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44","Type":"ContainerStarted","Data":"1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c"} Feb 18 12:59:16 crc kubenswrapper[4922]: I0218 12:59:16.656910 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bf65p/must-gather-pnxz8" podStartSLOduration=2.999636208 podStartE2EDuration="10.656887539s" podCreationTimestamp="2026-02-18 12:59:06 +0000 UTC" firstStartedPulling="2026-02-18 12:59:08.173455247 +0000 UTC m=+4949.901159327" lastFinishedPulling="2026-02-18 12:59:15.830706578 +0000 UTC m=+4957.558410658" observedRunningTime="2026-02-18 12:59:16.653884173 +0000 UTC m=+4958.381588263" watchObservedRunningTime="2026-02-18 12:59:16.656887539 +0000 UTC m=+4958.384591619" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.781963 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bf65p/crc-debug-zdn7g"] Feb 18 12:59:19 crc kubenswrapper[4922]: E0218 12:59:19.783316 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="registry-server" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.783332 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="registry-server" Feb 18 12:59:19 crc kubenswrapper[4922]: E0218 12:59:19.783353 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="extract-utilities" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.783375 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="extract-utilities" Feb 18 12:59:19 crc kubenswrapper[4922]: E0218 12:59:19.783391 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="extract-content" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.783397 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="extract-content" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.783583 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="37694696-8e79-4f78-be48-8d4bbdeef478" containerName="registry-server" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.784253 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.879850 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ee5a976-94bf-472d-af8f-09df6835587b-host\") pod \"crc-debug-zdn7g\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") " pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.879906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zcrn\" (UniqueName: \"kubernetes.io/projected/7ee5a976-94bf-472d-af8f-09df6835587b-kube-api-access-4zcrn\") pod \"crc-debug-zdn7g\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") " pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.982602 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ee5a976-94bf-472d-af8f-09df6835587b-host\") pod \"crc-debug-zdn7g\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") " pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.982696 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zcrn\" (UniqueName: 
\"kubernetes.io/projected/7ee5a976-94bf-472d-af8f-09df6835587b-kube-api-access-4zcrn\") pod \"crc-debug-zdn7g\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") " pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:19 crc kubenswrapper[4922]: I0218 12:59:19.982816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ee5a976-94bf-472d-af8f-09df6835587b-host\") pod \"crc-debug-zdn7g\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") " pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:20 crc kubenswrapper[4922]: I0218 12:59:20.006285 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zcrn\" (UniqueName: \"kubernetes.io/projected/7ee5a976-94bf-472d-af8f-09df6835587b-kube-api-access-4zcrn\") pod \"crc-debug-zdn7g\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") " pod="openshift-must-gather-bf65p/crc-debug-zdn7g" Feb 18 12:59:20 crc kubenswrapper[4922]: I0218 12:59:20.101735 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-zdn7g"
Feb 18 12:59:20 crc kubenswrapper[4922]: W0218 12:59:20.145619 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ee5a976_94bf_472d_af8f_09df6835587b.slice/crio-5aa8a6bf58c8c23a03dba7b3217875b34ffb49c2470c51d0c70e37165b156f73 WatchSource:0}: Error finding container 5aa8a6bf58c8c23a03dba7b3217875b34ffb49c2470c51d0c70e37165b156f73: Status 404 returned error can't find the container with id 5aa8a6bf58c8c23a03dba7b3217875b34ffb49c2470c51d0c70e37165b156f73
Feb 18 12:59:20 crc kubenswrapper[4922]: I0218 12:59:20.684493 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-zdn7g" event={"ID":"7ee5a976-94bf-472d-af8f-09df6835587b","Type":"ContainerStarted","Data":"5aa8a6bf58c8c23a03dba7b3217875b34ffb49c2470c51d0c70e37165b156f73"}
Feb 18 12:59:31 crc kubenswrapper[4922]: I0218 12:59:31.799545 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-zdn7g" event={"ID":"7ee5a976-94bf-472d-af8f-09df6835587b","Type":"ContainerStarted","Data":"807a994ec59a7105a4222c75e8c367b51fc34fa12867e26594ab79a75f9ffd06"}
Feb 18 12:59:31 crc kubenswrapper[4922]: I0218 12:59:31.823630 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bf65p/crc-debug-zdn7g" podStartSLOduration=2.104918267 podStartE2EDuration="12.823610368s" podCreationTimestamp="2026-02-18 12:59:19 +0000 UTC" firstStartedPulling="2026-02-18 12:59:20.148395085 +0000 UTC m=+4961.876099165" lastFinishedPulling="2026-02-18 12:59:30.867087176 +0000 UTC m=+4972.594791266" observedRunningTime="2026-02-18 12:59:31.81459248 +0000 UTC m=+4973.542296560" watchObservedRunningTime="2026-02-18 12:59:31.823610368 +0000 UTC m=+4973.551314448"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.183439 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"]
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.185757 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.189720 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.191307 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.210831 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"]
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.322946 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dfb0ade-7d72-46ba-b438-04dd2465b963-config-volume\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.323260 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dfb0ade-7d72-46ba-b438-04dd2465b963-secret-volume\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.323572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8926d\" (UniqueName: \"kubernetes.io/projected/9dfb0ade-7d72-46ba-b438-04dd2465b963-kube-api-access-8926d\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.426269 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8926d\" (UniqueName: \"kubernetes.io/projected/9dfb0ade-7d72-46ba-b438-04dd2465b963-kube-api-access-8926d\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.426751 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dfb0ade-7d72-46ba-b438-04dd2465b963-config-volume\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.426927 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dfb0ade-7d72-46ba-b438-04dd2465b963-secret-volume\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.427772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dfb0ade-7d72-46ba-b438-04dd2465b963-config-volume\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.434320 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dfb0ade-7d72-46ba-b438-04dd2465b963-secret-volume\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.469411 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8926d\" (UniqueName: \"kubernetes.io/projected/9dfb0ade-7d72-46ba-b438-04dd2465b963-kube-api-access-8926d\") pod \"collect-profiles-29523660-khccs\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:00 crc kubenswrapper[4922]: I0218 13:00:00.512617 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:01 crc kubenswrapper[4922]: I0218 13:00:01.061103 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"]
Feb 18 13:00:01 crc kubenswrapper[4922]: W0218 13:00:01.064613 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dfb0ade_7d72_46ba_b438_04dd2465b963.slice/crio-030383aab789df0d99eca0e575230a460d70ccefa58d04b4d79bd4d9df818265 WatchSource:0}: Error finding container 030383aab789df0d99eca0e575230a460d70ccefa58d04b4d79bd4d9df818265: Status 404 returned error can't find the container with id 030383aab789df0d99eca0e575230a460d70ccefa58d04b4d79bd4d9df818265
Feb 18 13:00:01 crc kubenswrapper[4922]: I0218 13:00:01.151373 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs" event={"ID":"9dfb0ade-7d72-46ba-b438-04dd2465b963","Type":"ContainerStarted","Data":"030383aab789df0d99eca0e575230a460d70ccefa58d04b4d79bd4d9df818265"}
Feb 18 13:00:02 crc kubenswrapper[4922]: I0218 13:00:02.162635 4922 generic.go:334] "Generic (PLEG): container finished" podID="9dfb0ade-7d72-46ba-b438-04dd2465b963" containerID="4a6a3619ea4131a004e539864c38f6aff500cbd29e608f03198bcf5f40b0a6e5" exitCode=0
Feb 18 13:00:02 crc kubenswrapper[4922]: I0218 13:00:02.162943 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs" event={"ID":"9dfb0ade-7d72-46ba-b438-04dd2465b963","Type":"ContainerDied","Data":"4a6a3619ea4131a004e539864c38f6aff500cbd29e608f03198bcf5f40b0a6e5"}
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.109937 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.194868 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs" event={"ID":"9dfb0ade-7d72-46ba-b438-04dd2465b963","Type":"ContainerDied","Data":"030383aab789df0d99eca0e575230a460d70ccefa58d04b4d79bd4d9df818265"}
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.195200 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030383aab789df0d99eca0e575230a460d70ccefa58d04b4d79bd4d9df818265"
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.195079 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523660-khccs"
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.274423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dfb0ade-7d72-46ba-b438-04dd2465b963-secret-volume\") pod \"9dfb0ade-7d72-46ba-b438-04dd2465b963\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") "
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.274483 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8926d\" (UniqueName: \"kubernetes.io/projected/9dfb0ade-7d72-46ba-b438-04dd2465b963-kube-api-access-8926d\") pod \"9dfb0ade-7d72-46ba-b438-04dd2465b963\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") "
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.274622 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dfb0ade-7d72-46ba-b438-04dd2465b963-config-volume\") pod \"9dfb0ade-7d72-46ba-b438-04dd2465b963\" (UID: \"9dfb0ade-7d72-46ba-b438-04dd2465b963\") "
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.277356 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dfb0ade-7d72-46ba-b438-04dd2465b963-config-volume" (OuterVolumeSpecName: "config-volume") pod "9dfb0ade-7d72-46ba-b438-04dd2465b963" (UID: "9dfb0ade-7d72-46ba-b438-04dd2465b963"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.293483 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfb0ade-7d72-46ba-b438-04dd2465b963-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9dfb0ade-7d72-46ba-b438-04dd2465b963" (UID: "9dfb0ade-7d72-46ba-b438-04dd2465b963"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.295233 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfb0ade-7d72-46ba-b438-04dd2465b963-kube-api-access-8926d" (OuterVolumeSpecName: "kube-api-access-8926d") pod "9dfb0ade-7d72-46ba-b438-04dd2465b963" (UID: "9dfb0ade-7d72-46ba-b438-04dd2465b963"). InnerVolumeSpecName "kube-api-access-8926d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.378012 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dfb0ade-7d72-46ba-b438-04dd2465b963-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.378068 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dfb0ade-7d72-46ba-b438-04dd2465b963-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:04 crc kubenswrapper[4922]: I0218 13:00:04.378114 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8926d\" (UniqueName: \"kubernetes.io/projected/9dfb0ade-7d72-46ba-b438-04dd2465b963-kube-api-access-8926d\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:05 crc kubenswrapper[4922]: I0218 13:00:05.193046 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4"]
Feb 18 13:00:05 crc kubenswrapper[4922]: I0218 13:00:05.212033 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-grpz4"]
Feb 18 13:00:06 crc kubenswrapper[4922]: I0218 13:00:06.992000 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fefb3a87-d203-4ac1-b63d-61c582015132" path="/var/lib/kubelet/pods/fefb3a87-d203-4ac1-b63d-61c582015132/volumes"
Feb 18 13:00:09 crc kubenswrapper[4922]: I0218 13:00:09.807534 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 13:00:09 crc kubenswrapper[4922]: I0218 13:00:09.807891 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 13:00:15 crc kubenswrapper[4922]: I0218 13:00:15.751789 4922 scope.go:117] "RemoveContainer" containerID="5b401b8ee4f7943af0a7b7807634c73c9cc5371f7bb8ea18f378db7de3390a99"
Feb 18 13:00:27 crc kubenswrapper[4922]: I0218 13:00:27.391199 4922 generic.go:334] "Generic (PLEG): container finished" podID="7ee5a976-94bf-472d-af8f-09df6835587b" containerID="807a994ec59a7105a4222c75e8c367b51fc34fa12867e26594ab79a75f9ffd06" exitCode=0
Feb 18 13:00:27 crc kubenswrapper[4922]: I0218 13:00:27.391314 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-zdn7g" event={"ID":"7ee5a976-94bf-472d-af8f-09df6835587b","Type":"ContainerDied","Data":"807a994ec59a7105a4222c75e8c367b51fc34fa12867e26594ab79a75f9ffd06"}
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.500694 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-zdn7g"
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.541132 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bf65p/crc-debug-zdn7g"]
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.551065 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bf65p/crc-debug-zdn7g"]
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.640016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ee5a976-94bf-472d-af8f-09df6835587b-host\") pod \"7ee5a976-94bf-472d-af8f-09df6835587b\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") "
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.640174 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zcrn\" (UniqueName: \"kubernetes.io/projected/7ee5a976-94bf-472d-af8f-09df6835587b-kube-api-access-4zcrn\") pod \"7ee5a976-94bf-472d-af8f-09df6835587b\" (UID: \"7ee5a976-94bf-472d-af8f-09df6835587b\") "
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.640351 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ee5a976-94bf-472d-af8f-09df6835587b-host" (OuterVolumeSpecName: "host") pod "7ee5a976-94bf-472d-af8f-09df6835587b" (UID: "7ee5a976-94bf-472d-af8f-09df6835587b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.640698 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ee5a976-94bf-472d-af8f-09df6835587b-host\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.655229 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee5a976-94bf-472d-af8f-09df6835587b-kube-api-access-4zcrn" (OuterVolumeSpecName: "kube-api-access-4zcrn") pod "7ee5a976-94bf-472d-af8f-09df6835587b" (UID: "7ee5a976-94bf-472d-af8f-09df6835587b"). InnerVolumeSpecName "kube-api-access-4zcrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.742600 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zcrn\" (UniqueName: \"kubernetes.io/projected/7ee5a976-94bf-472d-af8f-09df6835587b-kube-api-access-4zcrn\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:28 crc kubenswrapper[4922]: I0218 13:00:28.996328 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee5a976-94bf-472d-af8f-09df6835587b" path="/var/lib/kubelet/pods/7ee5a976-94bf-472d-af8f-09df6835587b/volumes"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.408830 4922 scope.go:117] "RemoveContainer" containerID="807a994ec59a7105a4222c75e8c367b51fc34fa12867e26594ab79a75f9ffd06"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.408879 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-zdn7g"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.727521 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bf65p/crc-debug-t2cmf"]
Feb 18 13:00:29 crc kubenswrapper[4922]: E0218 13:00:29.727974 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfb0ade-7d72-46ba-b438-04dd2465b963" containerName="collect-profiles"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.727990 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfb0ade-7d72-46ba-b438-04dd2465b963" containerName="collect-profiles"
Feb 18 13:00:29 crc kubenswrapper[4922]: E0218 13:00:29.728034 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee5a976-94bf-472d-af8f-09df6835587b" containerName="container-00"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.728042 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee5a976-94bf-472d-af8f-09df6835587b" containerName="container-00"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.728259 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfb0ade-7d72-46ba-b438-04dd2465b963" containerName="collect-profiles"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.728282 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee5a976-94bf-472d-af8f-09df6835587b" containerName="container-00"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.730094 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.864427 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77a43352-36b4-4006-b6ca-489c923eaf63-host\") pod \"crc-debug-t2cmf\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") " pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.864652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkpx\" (UniqueName: \"kubernetes.io/projected/77a43352-36b4-4006-b6ca-489c923eaf63-kube-api-access-2zkpx\") pod \"crc-debug-t2cmf\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") " pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.966560 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77a43352-36b4-4006-b6ca-489c923eaf63-host\") pod \"crc-debug-t2cmf\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") " pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.966705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkpx\" (UniqueName: \"kubernetes.io/projected/77a43352-36b4-4006-b6ca-489c923eaf63-kube-api-access-2zkpx\") pod \"crc-debug-t2cmf\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") " pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.967177 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77a43352-36b4-4006-b6ca-489c923eaf63-host\") pod \"crc-debug-t2cmf\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") " pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:29 crc kubenswrapper[4922]: I0218 13:00:29.989137 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkpx\" (UniqueName: \"kubernetes.io/projected/77a43352-36b4-4006-b6ca-489c923eaf63-kube-api-access-2zkpx\") pod \"crc-debug-t2cmf\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") " pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:30 crc kubenswrapper[4922]: I0218 13:00:30.047009 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:30 crc kubenswrapper[4922]: I0218 13:00:30.417689 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-t2cmf" event={"ID":"77a43352-36b4-4006-b6ca-489c923eaf63","Type":"ContainerStarted","Data":"dabdb87ca65d6f8f0ea66c048ef4f9fa15f0c9dea5a0b549634e5eeaf236ae25"}
Feb 18 13:00:30 crc kubenswrapper[4922]: I0218 13:00:30.418200 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-t2cmf" event={"ID":"77a43352-36b4-4006-b6ca-489c923eaf63","Type":"ContainerStarted","Data":"29f3c28826d06281ee090cf35680da696091c0a166a7523f0f211375d3bfa443"}
Feb 18 13:00:30 crc kubenswrapper[4922]: I0218 13:00:30.437250 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bf65p/crc-debug-t2cmf" podStartSLOduration=1.4372315979999999 podStartE2EDuration="1.437231598s" podCreationTimestamp="2026-02-18 13:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:00:30.430863856 +0000 UTC m=+5032.158567936" watchObservedRunningTime="2026-02-18 13:00:30.437231598 +0000 UTC m=+5032.164935678"
Feb 18 13:00:31 crc kubenswrapper[4922]: I0218 13:00:31.429798 4922 generic.go:334] "Generic (PLEG): container finished" podID="77a43352-36b4-4006-b6ca-489c923eaf63" containerID="dabdb87ca65d6f8f0ea66c048ef4f9fa15f0c9dea5a0b549634e5eeaf236ae25" exitCode=0
Feb 18 13:00:31 crc kubenswrapper[4922]: I0218 13:00:31.430100 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-t2cmf" event={"ID":"77a43352-36b4-4006-b6ca-489c923eaf63","Type":"ContainerDied","Data":"dabdb87ca65d6f8f0ea66c048ef4f9fa15f0c9dea5a0b549634e5eeaf236ae25"}
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.711883 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.817954 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zkpx\" (UniqueName: \"kubernetes.io/projected/77a43352-36b4-4006-b6ca-489c923eaf63-kube-api-access-2zkpx\") pod \"77a43352-36b4-4006-b6ca-489c923eaf63\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") "
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.818335 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77a43352-36b4-4006-b6ca-489c923eaf63-host\") pod \"77a43352-36b4-4006-b6ca-489c923eaf63\" (UID: \"77a43352-36b4-4006-b6ca-489c923eaf63\") "
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.818410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77a43352-36b4-4006-b6ca-489c923eaf63-host" (OuterVolumeSpecName: "host") pod "77a43352-36b4-4006-b6ca-489c923eaf63" (UID: "77a43352-36b4-4006-b6ca-489c923eaf63"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.818860 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77a43352-36b4-4006-b6ca-489c923eaf63-host\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.824150 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a43352-36b4-4006-b6ca-489c923eaf63-kube-api-access-2zkpx" (OuterVolumeSpecName: "kube-api-access-2zkpx") pod "77a43352-36b4-4006-b6ca-489c923eaf63" (UID: "77a43352-36b4-4006-b6ca-489c923eaf63"). InnerVolumeSpecName "kube-api-access-2zkpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.879026 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bf65p/crc-debug-t2cmf"]
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.888928 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bf65p/crc-debug-t2cmf"]
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.920953 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zkpx\" (UniqueName: \"kubernetes.io/projected/77a43352-36b4-4006-b6ca-489c923eaf63-kube-api-access-2zkpx\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:32 crc kubenswrapper[4922]: I0218 13:00:32.983539 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a43352-36b4-4006-b6ca-489c923eaf63" path="/var/lib/kubelet/pods/77a43352-36b4-4006-b6ca-489c923eaf63/volumes"
Feb 18 13:00:33 crc kubenswrapper[4922]: I0218 13:00:33.450525 4922 scope.go:117] "RemoveContainer" containerID="dabdb87ca65d6f8f0ea66c048ef4f9fa15f0c9dea5a0b549634e5eeaf236ae25"
Feb 18 13:00:33 crc kubenswrapper[4922]: I0218 13:00:33.450662 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-t2cmf"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.039051 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bf65p/crc-debug-swr2w"]
Feb 18 13:00:34 crc kubenswrapper[4922]: E0218 13:00:34.039953 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a43352-36b4-4006-b6ca-489c923eaf63" containerName="container-00"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.039971 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a43352-36b4-4006-b6ca-489c923eaf63" containerName="container-00"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.040196 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a43352-36b4-4006-b6ca-489c923eaf63" containerName="container-00"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.040901 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.143917 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th4g8\" (UniqueName: \"kubernetes.io/projected/186cacb8-5e9c-4617-80ae-a6e968fa421b-kube-api-access-th4g8\") pod \"crc-debug-swr2w\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") " pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.143980 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186cacb8-5e9c-4617-80ae-a6e968fa421b-host\") pod \"crc-debug-swr2w\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") " pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.245849 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th4g8\" (UniqueName: \"kubernetes.io/projected/186cacb8-5e9c-4617-80ae-a6e968fa421b-kube-api-access-th4g8\") pod \"crc-debug-swr2w\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") " pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.245907 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186cacb8-5e9c-4617-80ae-a6e968fa421b-host\") pod \"crc-debug-swr2w\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") " pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.246110 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186cacb8-5e9c-4617-80ae-a6e968fa421b-host\") pod \"crc-debug-swr2w\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") " pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.262918 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th4g8\" (UniqueName: \"kubernetes.io/projected/186cacb8-5e9c-4617-80ae-a6e968fa421b-kube-api-access-th4g8\") pod \"crc-debug-swr2w\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") " pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.358968 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:34 crc kubenswrapper[4922]: I0218 13:00:34.460554 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-swr2w" event={"ID":"186cacb8-5e9c-4617-80ae-a6e968fa421b","Type":"ContainerStarted","Data":"326ba37b0375cb7b50735b154f30002943935d23cb145c6d19e89d30c20c8fa0"}
Feb 18 13:00:35 crc kubenswrapper[4922]: I0218 13:00:35.477280 4922 generic.go:334] "Generic (PLEG): container finished" podID="186cacb8-5e9c-4617-80ae-a6e968fa421b" containerID="6115389c653a52211bb6d60a2ee801c77bf458f01c30b57fb3cbccdfadf9cac5" exitCode=0
Feb 18 13:00:35 crc kubenswrapper[4922]: I0218 13:00:35.477803 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/crc-debug-swr2w" event={"ID":"186cacb8-5e9c-4617-80ae-a6e968fa421b","Type":"ContainerDied","Data":"6115389c653a52211bb6d60a2ee801c77bf458f01c30b57fb3cbccdfadf9cac5"}
Feb 18 13:00:35 crc kubenswrapper[4922]: I0218 13:00:35.518573 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bf65p/crc-debug-swr2w"]
Feb 18 13:00:35 crc kubenswrapper[4922]: I0218 13:00:35.527927 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bf65p/crc-debug-swr2w"]
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.612038 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.699693 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186cacb8-5e9c-4617-80ae-a6e968fa421b-host\") pod \"186cacb8-5e9c-4617-80ae-a6e968fa421b\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") "
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.699907 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186cacb8-5e9c-4617-80ae-a6e968fa421b-host" (OuterVolumeSpecName: "host") pod "186cacb8-5e9c-4617-80ae-a6e968fa421b" (UID: "186cacb8-5e9c-4617-80ae-a6e968fa421b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.700019 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th4g8\" (UniqueName: \"kubernetes.io/projected/186cacb8-5e9c-4617-80ae-a6e968fa421b-kube-api-access-th4g8\") pod \"186cacb8-5e9c-4617-80ae-a6e968fa421b\" (UID: \"186cacb8-5e9c-4617-80ae-a6e968fa421b\") "
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.700595 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186cacb8-5e9c-4617-80ae-a6e968fa421b-host\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.709721 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186cacb8-5e9c-4617-80ae-a6e968fa421b-kube-api-access-th4g8" (OuterVolumeSpecName: "kube-api-access-th4g8") pod "186cacb8-5e9c-4617-80ae-a6e968fa421b" (UID: "186cacb8-5e9c-4617-80ae-a6e968fa421b"). InnerVolumeSpecName "kube-api-access-th4g8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.803118 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th4g8\" (UniqueName: \"kubernetes.io/projected/186cacb8-5e9c-4617-80ae-a6e968fa421b-kube-api-access-th4g8\") on node \"crc\" DevicePath \"\""
Feb 18 13:00:36 crc kubenswrapper[4922]: I0218 13:00:36.984510 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186cacb8-5e9c-4617-80ae-a6e968fa421b" path="/var/lib/kubelet/pods/186cacb8-5e9c-4617-80ae-a6e968fa421b/volumes"
Feb 18 13:00:37 crc kubenswrapper[4922]: I0218 13:00:37.504778 4922 scope.go:117] "RemoveContainer" containerID="6115389c653a52211bb6d60a2ee801c77bf458f01c30b57fb3cbccdfadf9cac5"
Feb 18 13:00:37 crc kubenswrapper[4922]: I0218 13:00:37.504930 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/crc-debug-swr2w"
Feb 18 13:00:39 crc kubenswrapper[4922]: I0218 13:00:39.808023 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 13:00:39 crc kubenswrapper[4922]: I0218 13:00:39.808395 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.162731 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29523661-8n69m"]
Feb 18 13:01:00 crc kubenswrapper[4922]: E0218 13:01:00.163741 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186cacb8-5e9c-4617-80ae-a6e968fa421b" containerName="container-00"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.163764 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="186cacb8-5e9c-4617-80ae-a6e968fa421b" containerName="container-00"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.164042 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="186cacb8-5e9c-4617-80ae-a6e968fa421b" containerName="container-00"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.164805 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.178167 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523661-8n69m"]
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.267676 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-config-data\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.267730 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-fernet-keys\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.267768 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-combined-ca-bundle\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.267976 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/d70ede78-e133-44f7-8df1-fd86bfc44d38-kube-api-access-sjblx\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.369978 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-config-data\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.370034 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-fernet-keys\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.370074 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-combined-ca-bundle\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m"
Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.370131 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/d70ede78-e133-44f7-8df1-fd86bfc44d38-kube-api-access-sjblx\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " 
pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.378845 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-combined-ca-bundle\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.378876 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-config-data\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.385275 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-fernet-keys\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.392637 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/d70ede78-e133-44f7-8df1-fd86bfc44d38-kube-api-access-sjblx\") pod \"keystone-cron-29523661-8n69m\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:00 crc kubenswrapper[4922]: I0218 13:01:00.501482 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.028593 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523661-8n69m"] Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.304173 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-794d859fd8-fbbnx_d8d3eec1-763e-4874-b2af-19401e383fed/barbican-api/0.log" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.496886 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-794d859fd8-fbbnx_d8d3eec1-763e-4874-b2af-19401e383fed/barbican-api-log/0.log" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.547501 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78995b5fcd-pmbbf_2664c9b6-f62a-4453-8771-8c273f5f9ec1/barbican-keystone-listener/0.log" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.704377 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78995b5fcd-pmbbf_2664c9b6-f62a-4453-8771-8c273f5f9ec1/barbican-keystone-listener-log/0.log" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.746896 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523661-8n69m" event={"ID":"d70ede78-e133-44f7-8df1-fd86bfc44d38","Type":"ContainerStarted","Data":"d0f6fc0562afe44aa18fd1e63e1011ad503634a6e6172e5eb778e96a2afe56dd"} Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.746949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523661-8n69m" event={"ID":"d70ede78-e133-44f7-8df1-fd86bfc44d38","Type":"ContainerStarted","Data":"e7b035461fd11d5f92f5e193670bb3215ebfae191541dbd8e9b48c5dd56d3796"} Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.773055 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-cron-29523661-8n69m" podStartSLOduration=1.773032319 podStartE2EDuration="1.773032319s" podCreationTimestamp="2026-02-18 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:01:01.761394564 +0000 UTC m=+5063.489098654" watchObservedRunningTime="2026-02-18 13:01:01.773032319 +0000 UTC m=+5063.500736409" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.782912 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-676bd4cb85-2ggtc_93be7893-0b89-4762-870d-f5878ecddb3b/barbican-worker/0.log" Feb 18 13:01:01 crc kubenswrapper[4922]: I0218 13:01:01.846837 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-676bd4cb85-2ggtc_93be7893-0b89-4762-870d-f5878ecddb3b/barbican-worker-log/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.040762 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5rdwv_2685dd3b-59b6-4879-b59a-215b187b1344/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.163166 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bfc3cdcf-4513-4e18-8d43-c435fd877ae7/ceilometer-central-agent/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.233379 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bfc3cdcf-4513-4e18-8d43-c435fd877ae7/ceilometer-notification-agent/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.259695 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bfc3cdcf-4513-4e18-8d43-c435fd877ae7/proxy-httpd/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.324920 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_bfc3cdcf-4513-4e18-8d43-c435fd877ae7/sg-core/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.470280 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b897159b-9178-4f59-b254-08229460867d/cinder-api-log/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.544184 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b897159b-9178-4f59-b254-08229460867d/cinder-api/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.686817 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bd3cd2cf-8780-4de2-925c-5385d6398e49/cinder-scheduler/0.log" Feb 18 13:01:02 crc kubenswrapper[4922]: I0218 13:01:02.741434 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bd3cd2cf-8780-4de2-925c-5385d6398e49/probe/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.028091 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fcnzl_ff0774c0-eec1-4e8d-9368-8c3f26ec3d8d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.030051 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ws7ln_b2f62f96-5ba4-4d16-89d8-11ae5e941699/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.237036 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-zdlvc_d7048bd5-50d1-472a-a898-6cf57cf126d8/init/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.417024 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-zdlvc_d7048bd5-50d1-472a-a898-6cf57cf126d8/init/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 
13:01:03.494277 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-7b67m_28a59f5e-155a-44b9-827a-a48bf1615d3d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.665958 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b6dc74c5-zdlvc_d7048bd5-50d1-472a-a898-6cf57cf126d8/dnsmasq-dns/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.771899 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f5056168-d177-4e40-813a-db20d428ce9a/glance-log/0.log" Feb 18 13:01:03 crc kubenswrapper[4922]: I0218 13:01:03.790214 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f5056168-d177-4e40-813a-db20d428ce9a/glance-httpd/0.log" Feb 18 13:01:04 crc kubenswrapper[4922]: I0218 13:01:04.045600 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_342c8bfd-c2d6-4afd-b2be-3e1474b63b62/glance-httpd/0.log" Feb 18 13:01:04 crc kubenswrapper[4922]: I0218 13:01:04.065656 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_342c8bfd-c2d6-4afd-b2be-3e1474b63b62/glance-log/0.log" Feb 18 13:01:04 crc kubenswrapper[4922]: I0218 13:01:04.266574 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bbf5454f6-d5958_3bc8759d-86ff-415d-936a-064ef742f0d9/horizon/0.log" Feb 18 13:01:04 crc kubenswrapper[4922]: I0218 13:01:04.408149 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bxngq_98bc83e7-66dd-4133-82cd-d4301c233f9d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:04 crc kubenswrapper[4922]: I0218 13:01:04.673407 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-sz5wh_c107695a-fdf7-48c6-b165-5e4dd2427148/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:04 crc kubenswrapper[4922]: I0218 13:01:04.853848 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bbf5454f6-d5958_3bc8759d-86ff-415d-936a-064ef742f0d9/horizon-log/0.log" Feb 18 13:01:05 crc kubenswrapper[4922]: I0218 13:01:05.050586 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29523601-t5w2s_07d51aec-efff-44ea-b9c5-c5335f63e0f2/keystone-cron/0.log" Feb 18 13:01:05 crc kubenswrapper[4922]: I0218 13:01:05.236799 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29523661-8n69m_d70ede78-e133-44f7-8df1-fd86bfc44d38/keystone-cron/0.log" Feb 18 13:01:05 crc kubenswrapper[4922]: I0218 13:01:05.262984 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b854f8786-pls2t_2efd0609-4858-47ce-8213-6a74510e8acf/keystone-api/0.log" Feb 18 13:01:05 crc kubenswrapper[4922]: I0218 13:01:05.367080 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1b492a6f-c8fc-4a76-8645-9f94a29d5e6b/kube-state-metrics/0.log" Feb 18 13:01:05 crc kubenswrapper[4922]: I0218 13:01:05.524980 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-k2xhf_7d136111-09bf-46fe-aaf8-868a27741f9b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:06 crc kubenswrapper[4922]: I0218 13:01:06.143016 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f57669c89-7wt5g_49aa13b6-3343-43d5-949e-3118c1711ed0/neutron-api/0.log" Feb 18 13:01:06 crc kubenswrapper[4922]: I0218 13:01:06.192183 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-25nz6_9a5605fd-9ccc-45d2-bd39-7b9a6b8e36d3/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:06 crc kubenswrapper[4922]: I0218 13:01:06.193992 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f57669c89-7wt5g_49aa13b6-3343-43d5-949e-3118c1711ed0/neutron-httpd/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.038020 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4a95479a-1834-4e95-b18a-c0bcef05f7ed/nova-cell0-conductor-conductor/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.165863 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_31ef9a9b-fedd-4afd-8582-19ef097c98a2/nova-cell1-conductor-conductor/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.567057 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5f598a92-b7cc-4584-9a17-d4c6d031ceeb/nova-cell1-novncproxy-novncproxy/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.671045 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b05385b6-6350-4ee0-b628-a1eb55dd6067/nova-api-log/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.818183 4922 generic.go:334] "Generic (PLEG): container finished" podID="d70ede78-e133-44f7-8df1-fd86bfc44d38" containerID="d0f6fc0562afe44aa18fd1e63e1011ad503634a6e6172e5eb778e96a2afe56dd" exitCode=0 Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.818236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523661-8n69m" event={"ID":"d70ede78-e133-44f7-8df1-fd86bfc44d38","Type":"ContainerDied","Data":"d0f6fc0562afe44aa18fd1e63e1011ad503634a6e6172e5eb778e96a2afe56dd"} Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.939463 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_b05385b6-6350-4ee0-b628-a1eb55dd6067/nova-api-api/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.951407 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hswp7_6e9e482a-c85e-473f-b848-e6fb6ba6afcd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:07 crc kubenswrapper[4922]: I0218 13:01:07.990760 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6/nova-metadata-log/0.log" Feb 18 13:01:08 crc kubenswrapper[4922]: I0218 13:01:08.814955 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_873b23d0-3c83-4ab7-8178-1c4832c544a0/mysql-bootstrap/0.log" Feb 18 13:01:08 crc kubenswrapper[4922]: I0218 13:01:08.990375 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_873b23d0-3c83-4ab7-8178-1c4832c544a0/mysql-bootstrap/0.log" Feb 18 13:01:08 crc kubenswrapper[4922]: I0218 13:01:08.993181 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7319f7de-4554-4a03-ba7f-c0f414ab2fe5/nova-scheduler-scheduler/0.log" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.067528 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_873b23d0-3c83-4ab7-8178-1c4832c544a0/galera/0.log" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.221660 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.250206 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_302e3b56-c5a4-4e80-bb7e-a9e6a61a119e/mysql-bootstrap/0.log" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.348409 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-config-data\") pod \"d70ede78-e133-44f7-8df1-fd86bfc44d38\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.348452 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-fernet-keys\") pod \"d70ede78-e133-44f7-8df1-fd86bfc44d38\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.348591 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/d70ede78-e133-44f7-8df1-fd86bfc44d38-kube-api-access-sjblx\") pod \"d70ede78-e133-44f7-8df1-fd86bfc44d38\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.348628 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-combined-ca-bundle\") pod \"d70ede78-e133-44f7-8df1-fd86bfc44d38\" (UID: \"d70ede78-e133-44f7-8df1-fd86bfc44d38\") " Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.372089 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70ede78-e133-44f7-8df1-fd86bfc44d38-kube-api-access-sjblx" (OuterVolumeSpecName: "kube-api-access-sjblx") pod 
"d70ede78-e133-44f7-8df1-fd86bfc44d38" (UID: "d70ede78-e133-44f7-8df1-fd86bfc44d38"). InnerVolumeSpecName "kube-api-access-sjblx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.380774 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d70ede78-e133-44f7-8df1-fd86bfc44d38" (UID: "d70ede78-e133-44f7-8df1-fd86bfc44d38"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.450724 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.452446 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/d70ede78-e133-44f7-8df1-fd86bfc44d38-kube-api-access-sjblx\") on node \"crc\" DevicePath \"\"" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.495222 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d70ede78-e133-44f7-8df1-fd86bfc44d38" (UID: "d70ede78-e133-44f7-8df1-fd86bfc44d38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.522875 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-config-data" (OuterVolumeSpecName: "config-data") pod "d70ede78-e133-44f7-8df1-fd86bfc44d38" (UID: "d70ede78-e133-44f7-8df1-fd86bfc44d38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.554297 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.554342 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70ede78-e133-44f7-8df1-fd86bfc44d38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.654022 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_302e3b56-c5a4-4e80-bb7e-a9e6a61a119e/mysql-bootstrap/0.log" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.732931 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_302e3b56-c5a4-4e80-bb7e-a9e6a61a119e/galera/0.log" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.806869 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.807132 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.807235 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 13:01:09 crc 
kubenswrapper[4922]: I0218 13:01:09.808088 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3d670811b88086243e0421af6023cabe2c743133e3bc783d1903c4468a2a91f"} pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.808246 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://a3d670811b88086243e0421af6023cabe2c743133e3bc783d1903c4468a2a91f" gracePeriod=600 Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.839315 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523661-8n69m" event={"ID":"d70ede78-e133-44f7-8df1-fd86bfc44d38","Type":"ContainerDied","Data":"e7b035461fd11d5f92f5e193670bb3215ebfae191541dbd8e9b48c5dd56d3796"} Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.839386 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b035461fd11d5f92f5e193670bb3215ebfae191541dbd8e9b48c5dd56d3796" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.839498 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29523661-8n69m" Feb 18 13:01:09 crc kubenswrapper[4922]: I0218 13:01:09.887259 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_245b1cb9-d98f-4875-adf6-ab887f76849d/openstackclient/0.log" Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.059437 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3ad5f860-e8c4-4b02-bd20-cd4d19ba20a6/nova-metadata-metadata/0.log" Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.613936 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-996pg_a2d0a226-07e2-402d-a868-2f8374670dac/ovn-controller/0.log" Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.667888 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wknpt_c831c6ce-ca0c-4f7d-8268-b4efe13e687d/openstack-network-exporter/0.log" Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.850284 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="a3d670811b88086243e0421af6023cabe2c743133e3bc783d1903c4468a2a91f" exitCode=0 Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.850340 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"a3d670811b88086243e0421af6023cabe2c743133e3bc783d1903c4468a2a91f"} Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.850398 4922 scope.go:117] "RemoveContainer" containerID="5cb544d05c9692e1a613f94aebafc80d5e6b64a4e004e82767f3ecc7c8a1d324" Feb 18 13:01:10 crc kubenswrapper[4922]: I0218 13:01:10.869872 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-stvc7_cf286fe0-1b17-475a-b71b-ac4897c2f59d/ovsdb-server-init/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 
13:01:11.065523 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-stvc7_cf286fe0-1b17-475a-b71b-ac4897c2f59d/ovsdb-server/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.107926 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-stvc7_cf286fe0-1b17-475a-b71b-ac4897c2f59d/ovs-vswitchd/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.122964 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-stvc7_cf286fe0-1b17-475a-b71b-ac4897c2f59d/ovsdb-server-init/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.337535 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-8jmtt_45d322f9-bf52-4679-ab43-9d222bc09a14/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.345689 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_09bbc755-2862-437b-9ef3-515103f77710/openstack-network-exporter/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.433889 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_09bbc755-2862-437b-9ef3-515103f77710/ovn-northd/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.695289 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b6f3b4f2-3f65-4278-9cd0-753adfee2ecd/ovsdbserver-nb/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.700135 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b6f3b4f2-3f65-4278-9cd0-753adfee2ecd/openstack-network-exporter/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.881326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" 
event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003"} Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.891917 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_186f064b-a9e8-4637-a5eb-1646f2e1a783/openstack-network-exporter/0.log" Feb 18 13:01:11 crc kubenswrapper[4922]: I0218 13:01:11.919545 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_186f064b-a9e8-4637-a5eb-1646f2e1a783/ovsdbserver-sb/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.146155 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c7b84785b-f8lmj_280ad3f5-10de-4dc8-866b-c7502c004835/placement-api/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.211057 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_13358646-85fa-4761-b4e8-ce5baf8851da/init-config-reloader/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.323581 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c7b84785b-f8lmj_280ad3f5-10de-4dc8-866b-c7502c004835/placement-log/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.488776 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_13358646-85fa-4761-b4e8-ce5baf8851da/config-reloader/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.522041 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_13358646-85fa-4761-b4e8-ce5baf8851da/prometheus/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.531189 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_13358646-85fa-4761-b4e8-ce5baf8851da/thanos-sidecar/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.734180 
4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_13358646-85fa-4761-b4e8-ce5baf8851da/init-config-reloader/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.742850 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9eb7dcb0-20c5-414c-bc86-58461654bcb5/setup-container/0.log" Feb 18 13:01:12 crc kubenswrapper[4922]: I0218 13:01:12.922741 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9eb7dcb0-20c5-414c-bc86-58461654bcb5/setup-container/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:12.999998 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb934d91-0203-48d1-be6a-ab13e821993d/setup-container/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.014821 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9eb7dcb0-20c5-414c-bc86-58461654bcb5/rabbitmq/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.301501 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb934d91-0203-48d1-be6a-ab13e821993d/setup-container/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.345485 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lwp7z_fac1ed4a-2fa4-4220-80fb-f54e3a357fb9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.409153 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb934d91-0203-48d1-be6a-ab13e821993d/rabbitmq/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.617481 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h7qdx_30aa9b56-28ab-4d32-beb5-965876a6e243/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.627152 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zs9qz_08ba745d-df3b-42c0-a384-ca64c96dd47f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.851000 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-b5h25_227ab888-976c-4ce1-beb8-abbe305c6d79/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:13 crc kubenswrapper[4922]: I0218 13:01:13.869391 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cz4px_c2fa843a-470e-441c-93c9-8c412459933b/ssh-known-hosts-edpm-deployment/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.153326 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6bb9876df9-jt7kg_8cc5cf6d-c722-42a3-8389-b991e77d1bbf/proxy-server/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.295910 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6bb9876df9-jt7kg_8cc5cf6d-c722-42a3-8389-b991e77d1bbf/proxy-httpd/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.378373 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6gzbs_83fbf909-70fe-4d3c-9b45-3f5a6733779c/swift-ring-rebalance/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.495257 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/account-auditor/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.525487 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/account-reaper/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.602596 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/account-replicator/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.674038 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/account-server/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.762445 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/container-auditor/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.808597 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/container-replicator/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.848910 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/container-server/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.917076 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/container-updater/0.log" Feb 18 13:01:14 crc kubenswrapper[4922]: I0218 13:01:14.976472 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/object-auditor/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.057067 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/object-expirer/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.140232 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/object-replicator/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.163405 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/object-server/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.197084 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/object-updater/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.282733 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/rsync/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.348004 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0771bdc1-7622-4a65-aa82-3150630ce652/swift-recon-cron/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.544015 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-57sjs_0c5871a2-bb79-4b43-a830-7714fa7d8241/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.662346 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4525818f-9e1d-48a0-8ec1-1a22a0841dd4/tempest-tests-tempest-tests-runner/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.818952 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fc2833dd-ab51-414c-9ce3-ed8078989ea5/test-operator-logs-container/0.log" Feb 18 13:01:15 crc kubenswrapper[4922]: I0218 13:01:15.931866 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-hprwc_353e7c86-6842-40e4-ac3d-e2032eef15c5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 13:01:16 crc kubenswrapper[4922]: I0218 13:01:16.662533 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_cd84d8c9-0a98-4f6b-b6da-887f4d294a38/watcher-applier/0.log" Feb 18 13:01:17 crc kubenswrapper[4922]: I0218 13:01:17.146495 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_43b1edea-6c95-42ae-b30a-d3ce2eb1e0de/watcher-api-log/0.log" Feb 18 13:01:17 crc kubenswrapper[4922]: I0218 13:01:17.804606 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3df41ae7-b237-49e2-902c-f33e693f5db9/watcher-decision-engine/0.log" Feb 18 13:01:20 crc kubenswrapper[4922]: I0218 13:01:20.228816 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_43b1edea-6c95-42ae-b30a-d3ce2eb1e0de/watcher-api/0.log" Feb 18 13:01:21 crc kubenswrapper[4922]: I0218 13:01:21.019455 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0ce20f52-4b9d-47a6-8da7-c64cd1d15623/memcached/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.334667 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/util/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.545854 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/pull/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.554087 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/util/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.598690 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/pull/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.805249 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/extract/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.838317 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/pull/0.log" Feb 18 13:01:45 crc kubenswrapper[4922]: I0218 13:01:45.889905 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14461323130afa0a42cfbeae9371df3c6df69f0e6e31cfc4d749781171nvjzv_83694df8-b6fe-4913-8f73-d53972c81f36/util/0.log" Feb 18 13:01:46 crc kubenswrapper[4922]: I0218 13:01:46.416843 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-2ncv8_01766bee-50bd-4dcb-9b3d-831486ddeaf4/manager/0.log" Feb 18 13:01:46 crc kubenswrapper[4922]: I0218 13:01:46.867426 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-bnvrn_4c9af0bf-50d7-42ef-a8df-241b5ec63f5a/manager/0.log" Feb 18 13:01:47 crc kubenswrapper[4922]: I0218 13:01:47.290127 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-qm24h_51cd14ee-9b8a-421f-80bb-d208b752079d/manager/0.log" Feb 18 13:01:47 crc kubenswrapper[4922]: I0218 
13:01:47.594215 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-82hvr_0032092e-84ca-426d-8f15-5141f4a8da20/manager/0.log" Feb 18 13:01:48 crc kubenswrapper[4922]: I0218 13:01:48.759658 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-r4v59_7753280d-fc59-4887-9d87-a2cfd83e7ba9/manager/0.log" Feb 18 13:01:48 crc kubenswrapper[4922]: I0218 13:01:48.881187 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-krt25_3c16d873-1097-4f56-913f-cc366ed34c23/manager/0.log" Feb 18 13:01:49 crc kubenswrapper[4922]: I0218 13:01:49.120472 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-6z2cq_61f73f1d-e472-411e-adc0-6755c47aa72b/manager/0.log" Feb 18 13:01:49 crc kubenswrapper[4922]: I0218 13:01:49.212291 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-jtfzr_324031ff-ceae-4065-9955-fd5745647cb0/manager/0.log" Feb 18 13:01:49 crc kubenswrapper[4922]: I0218 13:01:49.365240 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-c597h_2936db6d-8a5b-4da8-9e52-e508a6e757fe/manager/0.log" Feb 18 13:01:50 crc kubenswrapper[4922]: I0218 13:01:50.166898 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-gwbk7_a7487625-0c9e-4396-8eb8-5840ce4344c8/manager/0.log" Feb 18 13:01:50 crc kubenswrapper[4922]: I0218 13:01:50.184190 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-tn47v_0a8811b6-4023-427d-a893-628e0dd338e8/manager/0.log" Feb 18 13:01:50 crc 
kubenswrapper[4922]: I0218 13:01:50.503343 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-wrd8w_90b4a58a-81d7-4129-8f45-5429e963676e/manager/0.log" Feb 18 13:01:50 crc kubenswrapper[4922]: I0218 13:01:50.608663 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cxfsr7_081d9ec7-e338-437a-b3bc-af9b788db66a/manager/0.log" Feb 18 13:01:50 crc kubenswrapper[4922]: I0218 13:01:50.929750 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-f8b4c896c-mdz6v_51a617b6-1c84-446a-a342-bd0687227c0c/operator/0.log" Feb 18 13:01:51 crc kubenswrapper[4922]: I0218 13:01:51.190831 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-8rrxt_191b8ec5-c4e8-4e8c-92c2-fa2fd655f94a/registry-server/0.log" Feb 18 13:01:51 crc kubenswrapper[4922]: I0218 13:01:51.463222 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-z7pdl_42271b89-6aba-4e15-a2a1-856b656a1b6e/manager/0.log" Feb 18 13:01:51 crc kubenswrapper[4922]: I0218 13:01:51.881956 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-hddmr_66351682-3cdf-41cc-80d9-0bbb020144d2/manager/0.log" Feb 18 13:01:52 crc kubenswrapper[4922]: I0218 13:01:52.140916 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-98zrv_69ef021e-1b46-4aeb-8023-93f6fb366396/operator/0.log" Feb 18 13:01:52 crc kubenswrapper[4922]: I0218 13:01:52.405511 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-2bk9r_183b09db-ca5a-4aa1-b87b-908de4dc44ff/manager/0.log" Feb 18 13:01:52 
crc kubenswrapper[4922]: I0218 13:01:52.935684 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-btlqf_387afbf1-afa5-414c-a22a-83a6a8197ff7/manager/0.log" Feb 18 13:01:53 crc kubenswrapper[4922]: I0218 13:01:53.112278 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-xdrrr_52123256-1372-49b6-80ed-c3112d14a8fa/manager/0.log" Feb 18 13:01:53 crc kubenswrapper[4922]: I0218 13:01:53.367018 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8685d86d55-pbbl7_d81b14bf-a056-4780-af1a-bf38babee5b3/manager/0.log" Feb 18 13:01:53 crc kubenswrapper[4922]: I0218 13:01:53.489886 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5689f5d7c4-95x8t_4c487619-568f-44a0-9d23-037794ada114/manager/0.log" Feb 18 13:01:53 crc kubenswrapper[4922]: I0218 13:01:53.615030 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-4fm4m_8eae5053-64f3-401a-a151-dbf22f30a845/manager/0.log" Feb 18 13:01:58 crc kubenswrapper[4922]: I0218 13:01:58.655199 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-f8lbk_ae81863a-2778-4505-9106-c850f873a75d/manager/0.log" Feb 18 13:02:15 crc kubenswrapper[4922]: I0218 13:02:15.207170 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-b6msr_adf4d88c-a19b-49bf-bb62-eef23b55efae/control-plane-machine-set-operator/0.log" Feb 18 13:02:15 crc kubenswrapper[4922]: I0218 13:02:15.385340 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5sz92_4a5c3121-2765-47df-aa3f-22595e4b4ea9/kube-rbac-proxy/0.log" Feb 18 13:02:15 crc kubenswrapper[4922]: I0218 13:02:15.427296 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5sz92_4a5c3121-2765-47df-aa3f-22595e4b4ea9/machine-api-operator/0.log" Feb 18 13:02:28 crc kubenswrapper[4922]: I0218 13:02:28.007700 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tq4pt_906da7e7-ffe0-496f-bfb4-a76c2c14589e/cert-manager-controller/0.log" Feb 18 13:02:28 crc kubenswrapper[4922]: I0218 13:02:28.133792 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wlvsw_c066ada8-ed3e-4ca8-ba77-2c8c9014fdb1/cert-manager-cainjector/0.log" Feb 18 13:02:28 crc kubenswrapper[4922]: I0218 13:02:28.196211 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vvgzd_04a66d89-6415-45c5-b87b-b3730678eac4/cert-manager-webhook/0.log" Feb 18 13:02:40 crc kubenswrapper[4922]: I0218 13:02:40.949338 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-8x52x_8a41aeaf-5b15-4c8c-8abc-ad77b8e33896/nmstate-console-plugin/0.log" Feb 18 13:02:41 crc kubenswrapper[4922]: I0218 13:02:41.118937 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xgmj2_ea43ed5b-6735-4fd5-8fc5-1a01dcaeea01/nmstate-handler/0.log" Feb 18 13:02:41 crc kubenswrapper[4922]: I0218 13:02:41.140213 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-2dtql_4e3e71a0-5178-4016-853d-0d0c31563d99/kube-rbac-proxy/0.log" Feb 18 13:02:41 crc kubenswrapper[4922]: I0218 13:02:41.264685 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-2dtql_4e3e71a0-5178-4016-853d-0d0c31563d99/nmstate-metrics/0.log" Feb 18 13:02:41 crc kubenswrapper[4922]: I0218 13:02:41.301890 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-p7vsx_578f51b2-8e78-4720-93f6-7cd9ce17e2ed/nmstate-operator/0.log" Feb 18 13:02:41 crc kubenswrapper[4922]: I0218 13:02:41.464312 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-7mvdv_df5bbc9b-9ba2-416b-93db-c4f6155b6906/nmstate-webhook/0.log" Feb 18 13:02:54 crc kubenswrapper[4922]: I0218 13:02:54.670252 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-cq76p_1446ef26-f977-4255-a1b2-a42e8107303e/prometheus-operator/0.log" Feb 18 13:02:54 crc kubenswrapper[4922]: I0218 13:02:54.864113 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5_879d4ddb-47d1-4987-a980-e9f05104e5cb/prometheus-operator-admission-webhook/0.log" Feb 18 13:02:54 crc kubenswrapper[4922]: I0218 13:02:54.933056 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p_333644cd-a424-47a3-b701-378149dcdc80/prometheus-operator-admission-webhook/0.log" Feb 18 13:02:55 crc kubenswrapper[4922]: I0218 13:02:55.139857 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-tkz2d_10c40ab6-7b55-410d-958e-3a6a37818c88/operator/0.log" Feb 18 13:02:55 crc kubenswrapper[4922]: I0218 13:02:55.146965 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-mh85w_20e893d8-cc0c-4bdf-83d6-698e08e5d82b/perses-operator/0.log" Feb 18 13:03:08 crc kubenswrapper[4922]: I0218 13:03:08.938218 4922 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-8ds4f_4e80d896-3eb4-4dc8-b217-441a5a09dd05/kube-rbac-proxy/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.132023 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-8ds4f_4e80d896-3eb4-4dc8-b217-441a5a09dd05/controller/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.255966 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-frr-files/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.388950 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-frr-files/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.424401 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-metrics/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.471021 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-reloader/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.488189 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-reloader/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.648680 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-frr-files/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.673840 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-metrics/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.681977 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-metrics/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.692892 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-reloader/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.862525 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-metrics/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.867142 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-reloader/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.880280 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/controller/0.log" Feb 18 13:03:09 crc kubenswrapper[4922]: I0218 13:03:09.882830 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/cp-frr-files/0.log" Feb 18 13:03:10 crc kubenswrapper[4922]: I0218 13:03:10.082863 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/frr-metrics/0.log" Feb 18 13:03:10 crc kubenswrapper[4922]: I0218 13:03:10.132443 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/kube-rbac-proxy-frr/0.log" Feb 18 13:03:10 crc kubenswrapper[4922]: I0218 13:03:10.133985 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/kube-rbac-proxy/0.log" Feb 18 13:03:10 crc kubenswrapper[4922]: I0218 13:03:10.297499 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/reloader/0.log" Feb 18 13:03:10 crc kubenswrapper[4922]: I0218 13:03:10.852876 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-9cn2f_74e84ee6-9d14-48aa-9e59-f1ee46e15fcf/frr-k8s-webhook-server/0.log" Feb 18 13:03:10 crc kubenswrapper[4922]: I0218 13:03:10.958257 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-576949b4c-vwcqv_9fbb7bfe-c8d9-4a50-9326-bf07e99f4336/manager/0.log" Feb 18 13:03:11 crc kubenswrapper[4922]: I0218 13:03:11.173423 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6d8c5554f7-psxr7_7c9c6b01-e766-411c-a275-ae7ea3a9659e/webhook-server/0.log" Feb 18 13:03:11 crc kubenswrapper[4922]: I0218 13:03:11.294481 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7rvcx_aa729491-0a34-4772-8178-d8566c355add/kube-rbac-proxy/0.log" Feb 18 13:03:11 crc kubenswrapper[4922]: I0218 13:03:11.783249 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fwwpd_d069bacc-29a2-4aeb-9437-e654621c73c8/frr/0.log" Feb 18 13:03:11 crc kubenswrapper[4922]: I0218 13:03:11.885574 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7rvcx_aa729491-0a34-4772-8178-d8566c355add/speaker/0.log" Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.556800 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/util/0.log" Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.668607 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/util/0.log" 
Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.726636 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/pull/0.log" Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.759157 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/pull/0.log" Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.916332 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/util/0.log" Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.919607 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/extract/0.log" Feb 18 13:03:23 crc kubenswrapper[4922]: I0218 13:03:23.930014 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08l9tnp_00938d04-ee62-4756-830e-f66e2fbaab9d/pull/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.074137 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/util/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.244765 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/pull/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.259484 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/util/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.270257 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/pull/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.407144 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/pull/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.437561 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/util/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.465676 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b59p4_188679bc-8b67-4136-94ce-fa515c1c950a/extract/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.585190 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/extract-utilities/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.723327 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/extract-content/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.739980 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/extract-utilities/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.748840 4922 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/extract-content/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.920102 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/extract-content/0.log" Feb 18 13:03:24 crc kubenswrapper[4922]: I0218 13:03:24.926293 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/extract-utilities/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.100271 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/extract-utilities/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.352528 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/extract-utilities/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.396426 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/extract-content/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.411579 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/extract-content/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.625578 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/extract-utilities/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.636333 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/extract-content/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.730126 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cfw5z_523054ef-f8bb-4c7d-9baa-47191e299fcd/registry-server/0.log" Feb 18 13:03:25 crc kubenswrapper[4922]: I0218 13:03:25.865489 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/util/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.093551 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/pull/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.093932 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/util/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.166112 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/pull/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.351804 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/pull/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.371759 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/util/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.449029 
4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatx846_b19cf8eb-c4e0-42a2-bc33-246e5c756bda/extract/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.525734 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h9zn6_f779c873-d525-428d-88ed-828d00bf17eb/registry-server/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.598941 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gjc8w_452cdbd0-d1e1-491a-8edd-d0f88f602364/marketplace-operator/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.743595 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/extract-utilities/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.904594 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/extract-utilities/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.919687 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/extract-content/0.log" Feb 18 13:03:26 crc kubenswrapper[4922]: I0218 13:03:26.919675 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/extract-content/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.065866 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/extract-utilities/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.124215 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/extract-content/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.164275 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gsthg_0c2d2657-497c-4512-97ff-be630635c1df/registry-server/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.289218 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/extract-utilities/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.459487 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/extract-content/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.463168 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/extract-utilities/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.470696 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/extract-content/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.665502 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/extract-utilities/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.680340 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/extract-content/0.log" Feb 18 13:03:27 crc kubenswrapper[4922]: I0218 13:03:27.963687 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n6xrc_5e74836e-69fc-4faa-ac09-05926ad4810a/registry-server/0.log" Feb 18 
13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.564175 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-755d44f7c-bqmb5_879d4ddb-47d1-4987-a980-e9f05104e5cb/prometheus-operator-admission-webhook/0.log" Feb 18 13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.576662 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-cq76p_1446ef26-f977-4255-a1b2-a42e8107303e/prometheus-operator/0.log" Feb 18 13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.644531 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-755d44f7c-x5s2p_333644cd-a424-47a3-b701-378149dcdc80/prometheus-operator-admission-webhook/0.log" Feb 18 13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.771638 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-mh85w_20e893d8-cc0c-4bdf-83d6-698e08e5d82b/perses-operator/0.log" Feb 18 13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.788211 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-tkz2d_10c40ab6-7b55-410d-958e-3a6a37818c88/operator/0.log" Feb 18 13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.807003 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 13:03:39 crc kubenswrapper[4922]: I0218 13:03:39.807068 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 18 13:04:09 crc kubenswrapper[4922]: I0218 13:04:09.808574 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 13:04:09 crc kubenswrapper[4922]: I0218 13:04:09.809064 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.756732 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-99kzq"] Feb 18 13:04:12 crc kubenswrapper[4922]: E0218 13:04:12.757505 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70ede78-e133-44f7-8df1-fd86bfc44d38" containerName="keystone-cron" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.757524 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70ede78-e133-44f7-8df1-fd86bfc44d38" containerName="keystone-cron" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.757771 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70ede78-e133-44f7-8df1-fd86bfc44d38" containerName="keystone-cron" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.759451 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.781141 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99kzq"] Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.914348 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkvfr\" (UniqueName: \"kubernetes.io/projected/b4669bde-5144-4129-8236-f152c6a30cad-kube-api-access-bkvfr\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.914460 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-utilities\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:12 crc kubenswrapper[4922]: I0218 13:04:12.914507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-catalog-content\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.016625 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkvfr\" (UniqueName: \"kubernetes.io/projected/b4669bde-5144-4129-8236-f152c6a30cad-kube-api-access-bkvfr\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.016699 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-utilities\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.016743 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-catalog-content\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.017170 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-utilities\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.017676 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-catalog-content\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.039675 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkvfr\" (UniqueName: \"kubernetes.io/projected/b4669bde-5144-4129-8236-f152c6a30cad-kube-api-access-bkvfr\") pod \"redhat-operators-99kzq\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.079728 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:13 crc kubenswrapper[4922]: I0218 13:04:13.578737 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99kzq"] Feb 18 13:04:14 crc kubenswrapper[4922]: I0218 13:04:14.246850 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4669bde-5144-4129-8236-f152c6a30cad" containerID="9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1" exitCode=0 Feb 18 13:04:14 crc kubenswrapper[4922]: I0218 13:04:14.246930 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerDied","Data":"9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1"} Feb 18 13:04:14 crc kubenswrapper[4922]: I0218 13:04:14.247911 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerStarted","Data":"1db80cea99a696ccd86930611cb56a401b734513da64fc07525a0f22be964c57"} Feb 18 13:04:14 crc kubenswrapper[4922]: I0218 13:04:14.249459 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 13:04:15 crc kubenswrapper[4922]: I0218 13:04:15.261929 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerStarted","Data":"7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074"} Feb 18 13:04:18 crc kubenswrapper[4922]: I0218 13:04:18.309616 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4669bde-5144-4129-8236-f152c6a30cad" containerID="7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074" exitCode=0 Feb 18 13:04:18 crc kubenswrapper[4922]: I0218 13:04:18.309772 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerDied","Data":"7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074"} Feb 18 13:04:19 crc kubenswrapper[4922]: I0218 13:04:19.320731 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerStarted","Data":"1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6"} Feb 18 13:04:19 crc kubenswrapper[4922]: I0218 13:04:19.342634 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-99kzq" podStartSLOduration=2.860000838 podStartE2EDuration="7.342618972s" podCreationTimestamp="2026-02-18 13:04:12 +0000 UTC" firstStartedPulling="2026-02-18 13:04:14.249186053 +0000 UTC m=+5255.976890133" lastFinishedPulling="2026-02-18 13:04:18.731804187 +0000 UTC m=+5260.459508267" observedRunningTime="2026-02-18 13:04:19.340493288 +0000 UTC m=+5261.068197368" watchObservedRunningTime="2026-02-18 13:04:19.342618972 +0000 UTC m=+5261.070323052" Feb 18 13:04:23 crc kubenswrapper[4922]: I0218 13:04:23.080511 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:23 crc kubenswrapper[4922]: I0218 13:04:23.082451 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:24 crc kubenswrapper[4922]: I0218 13:04:24.138135 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-99kzq" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="registry-server" probeResult="failure" output=< Feb 18 13:04:24 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Feb 18 13:04:24 crc kubenswrapper[4922]: > Feb 18 13:04:33 crc kubenswrapper[4922]: I0218 
13:04:33.128110 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:33 crc kubenswrapper[4922]: I0218 13:04:33.198055 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:33 crc kubenswrapper[4922]: I0218 13:04:33.375682 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99kzq"] Feb 18 13:04:34 crc kubenswrapper[4922]: I0218 13:04:34.460305 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-99kzq" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="registry-server" containerID="cri-o://1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6" gracePeriod=2 Feb 18 13:04:34 crc kubenswrapper[4922]: I0218 13:04:34.936492 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.059549 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkvfr\" (UniqueName: \"kubernetes.io/projected/b4669bde-5144-4129-8236-f152c6a30cad-kube-api-access-bkvfr\") pod \"b4669bde-5144-4129-8236-f152c6a30cad\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.059751 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-utilities\") pod \"b4669bde-5144-4129-8236-f152c6a30cad\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.059786 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-catalog-content\") pod \"b4669bde-5144-4129-8236-f152c6a30cad\" (UID: \"b4669bde-5144-4129-8236-f152c6a30cad\") " Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.061649 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-utilities" (OuterVolumeSpecName: "utilities") pod "b4669bde-5144-4129-8236-f152c6a30cad" (UID: "b4669bde-5144-4129-8236-f152c6a30cad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.074549 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4669bde-5144-4129-8236-f152c6a30cad-kube-api-access-bkvfr" (OuterVolumeSpecName: "kube-api-access-bkvfr") pod "b4669bde-5144-4129-8236-f152c6a30cad" (UID: "b4669bde-5144-4129-8236-f152c6a30cad"). InnerVolumeSpecName "kube-api-access-bkvfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.162013 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkvfr\" (UniqueName: \"kubernetes.io/projected/b4669bde-5144-4129-8236-f152c6a30cad-kube-api-access-bkvfr\") on node \"crc\" DevicePath \"\"" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.162080 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.207896 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4669bde-5144-4129-8236-f152c6a30cad" (UID: "b4669bde-5144-4129-8236-f152c6a30cad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.263502 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4669bde-5144-4129-8236-f152c6a30cad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.475105 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4669bde-5144-4129-8236-f152c6a30cad" containerID="1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6" exitCode=0 Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.475188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerDied","Data":"1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6"} Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.475255 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99kzq" event={"ID":"b4669bde-5144-4129-8236-f152c6a30cad","Type":"ContainerDied","Data":"1db80cea99a696ccd86930611cb56a401b734513da64fc07525a0f22be964c57"} Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.475275 4922 scope.go:117] "RemoveContainer" containerID="1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.475211 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99kzq" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.508585 4922 scope.go:117] "RemoveContainer" containerID="7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.515035 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99kzq"] Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.523719 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-99kzq"] Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.538993 4922 scope.go:117] "RemoveContainer" containerID="9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.584379 4922 scope.go:117] "RemoveContainer" containerID="1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6" Feb 18 13:04:35 crc kubenswrapper[4922]: E0218 13:04:35.584783 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6\": container with ID starting with 1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6 not found: ID does not exist" containerID="1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.584812 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6"} err="failed to get container status \"1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6\": rpc error: code = NotFound desc = could not find container \"1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6\": container with ID starting with 1a35611b6198e4cc11556e483d246cd0ff22b99a19c84b6e5b856dac1279dfd6 not found: ID does 
not exist" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.584836 4922 scope.go:117] "RemoveContainer" containerID="7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074" Feb 18 13:04:35 crc kubenswrapper[4922]: E0218 13:04:35.585063 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074\": container with ID starting with 7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074 not found: ID does not exist" containerID="7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.585100 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074"} err="failed to get container status \"7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074\": rpc error: code = NotFound desc = could not find container \"7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074\": container with ID starting with 7c302066c13122506a7d14923dd931f9f8800c89244aa442754f6252fe024074 not found: ID does not exist" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.585120 4922 scope.go:117] "RemoveContainer" containerID="9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1" Feb 18 13:04:35 crc kubenswrapper[4922]: E0218 13:04:35.585534 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1\": container with ID starting with 9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1 not found: ID does not exist" containerID="9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1" Feb 18 13:04:35 crc kubenswrapper[4922]: I0218 13:04:35.585566 4922 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1"} err="failed to get container status \"9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1\": rpc error: code = NotFound desc = could not find container \"9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1\": container with ID starting with 9d7e7f7fc79b163445197b7a284968b514d2be2abd566afbb0fe72271dff0bf1 not found: ID does not exist" Feb 18 13:04:36 crc kubenswrapper[4922]: I0218 13:04:36.987196 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4669bde-5144-4129-8236-f152c6a30cad" path="/var/lib/kubelet/pods/b4669bde-5144-4129-8236-f152c6a30cad/volumes" Feb 18 13:04:39 crc kubenswrapper[4922]: I0218 13:04:39.807014 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 13:04:39 crc kubenswrapper[4922]: I0218 13:04:39.807296 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 13:04:39 crc kubenswrapper[4922]: I0218 13:04:39.807374 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-znglx" Feb 18 13:04:39 crc kubenswrapper[4922]: I0218 13:04:39.808148 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003"} 
pod="openshift-machine-config-operator/machine-config-daemon-znglx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 13:04:39 crc kubenswrapper[4922]: I0218 13:04:39.808200 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" containerID="cri-o://112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" gracePeriod=600 Feb 18 13:04:39 crc kubenswrapper[4922]: E0218 13:04:39.952734 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:04:40 crc kubenswrapper[4922]: I0218 13:04:40.526096 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" exitCode=0 Feb 18 13:04:40 crc kubenswrapper[4922]: I0218 13:04:40.526138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerDied","Data":"112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003"} Feb 18 13:04:40 crc kubenswrapper[4922]: I0218 13:04:40.526170 4922 scope.go:117] "RemoveContainer" containerID="a3d670811b88086243e0421af6023cabe2c743133e3bc783d1903c4468a2a91f" Feb 18 13:04:40 crc kubenswrapper[4922]: I0218 13:04:40.527369 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 
18 13:04:40 crc kubenswrapper[4922]: E0218 13:04:40.527747 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:04:52 crc kubenswrapper[4922]: I0218 13:04:52.974434 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:04:52 crc kubenswrapper[4922]: E0218 13:04:52.975222 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:05:04 crc kubenswrapper[4922]: I0218 13:05:04.540982 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:05:04 crc kubenswrapper[4922]: E0218 13:05:04.541926 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:05:18 crc kubenswrapper[4922]: I0218 13:05:18.994063 4922 scope.go:117] "RemoveContainer" 
containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:05:18 crc kubenswrapper[4922]: E0218 13:05:18.994931 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:05:31 crc kubenswrapper[4922]: I0218 13:05:31.972803 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:05:31 crc kubenswrapper[4922]: E0218 13:05:31.973544 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:05:46 crc kubenswrapper[4922]: I0218 13:05:46.973123 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:05:46 crc kubenswrapper[4922]: E0218 13:05:46.974153 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:05:48 crc kubenswrapper[4922]: I0218 13:05:48.149651 4922 generic.go:334] 
"Generic (PLEG): container finished" podID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerID="1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c" exitCode=0 Feb 18 13:05:48 crc kubenswrapper[4922]: I0218 13:05:48.149699 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bf65p/must-gather-pnxz8" event={"ID":"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44","Type":"ContainerDied","Data":"1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c"} Feb 18 13:05:48 crc kubenswrapper[4922]: I0218 13:05:48.150324 4922 scope.go:117] "RemoveContainer" containerID="1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c" Feb 18 13:05:48 crc kubenswrapper[4922]: I0218 13:05:48.543410 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bf65p_must-gather-pnxz8_e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44/gather/0.log" Feb 18 13:05:56 crc kubenswrapper[4922]: I0218 13:05:56.709429 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bf65p/must-gather-pnxz8"] Feb 18 13:05:56 crc kubenswrapper[4922]: I0218 13:05:56.710392 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bf65p/must-gather-pnxz8" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="copy" containerID="cri-o://87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0" gracePeriod=2 Feb 18 13:05:56 crc kubenswrapper[4922]: I0218 13:05:56.725242 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bf65p/must-gather-pnxz8"] Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.202407 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bf65p_must-gather-pnxz8_e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44/copy/0.log" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.203690 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.236993 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bf65p_must-gather-pnxz8_e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44/copy/0.log" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.237519 4922 generic.go:334] "Generic (PLEG): container finished" podID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerID="87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0" exitCode=143 Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.237581 4922 scope.go:117] "RemoveContainer" containerID="87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.237764 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bf65p/must-gather-pnxz8" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.262695 4922 scope.go:117] "RemoveContainer" containerID="1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.305858 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkczn\" (UniqueName: \"kubernetes.io/projected/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-kube-api-access-hkczn\") pod \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.305921 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-must-gather-output\") pod \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\" (UID: \"e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44\") " Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.312435 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-kube-api-access-hkczn" (OuterVolumeSpecName: "kube-api-access-hkczn") pod "e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" (UID: "e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44"). InnerVolumeSpecName "kube-api-access-hkczn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.352323 4922 scope.go:117] "RemoveContainer" containerID="87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0" Feb 18 13:05:57 crc kubenswrapper[4922]: E0218 13:05:57.352786 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0\": container with ID starting with 87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0 not found: ID does not exist" containerID="87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.352817 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0"} err="failed to get container status \"87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0\": rpc error: code = NotFound desc = could not find container \"87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0\": container with ID starting with 87e25aa3435f2488f0e8ad59cf6c52d09f26e56dbbb0c16afa85618c9286b2b0 not found: ID does not exist" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.352837 4922 scope.go:117] "RemoveContainer" containerID="1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c" Feb 18 13:05:57 crc kubenswrapper[4922]: E0218 13:05:57.353010 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c\": 
container with ID starting with 1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c not found: ID does not exist" containerID="1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.353030 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c"} err="failed to get container status \"1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c\": rpc error: code = NotFound desc = could not find container \"1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c\": container with ID starting with 1a1f695d758905465b01fcc900845ef77ea0ac9586c1010946a5b06f4411a66c not found: ID does not exist" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.410633 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkczn\" (UniqueName: \"kubernetes.io/projected/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-kube-api-access-hkczn\") on node \"crc\" DevicePath \"\"" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.535541 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" (UID: "e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:05:57 crc kubenswrapper[4922]: I0218 13:05:57.614202 4922 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 13:05:58 crc kubenswrapper[4922]: I0218 13:05:58.985564 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" path="/var/lib/kubelet/pods/e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44/volumes" Feb 18 13:06:00 crc kubenswrapper[4922]: I0218 13:06:00.975673 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:06:00 crc kubenswrapper[4922]: E0218 13:06:00.976655 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:06:14 crc kubenswrapper[4922]: I0218 13:06:14.972951 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:06:14 crc kubenswrapper[4922]: E0218 13:06:14.973825 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:06:27 crc kubenswrapper[4922]: I0218 13:06:27.973523 4922 
scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:06:27 crc kubenswrapper[4922]: E0218 13:06:27.974278 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:06:41 crc kubenswrapper[4922]: I0218 13:06:41.973459 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:06:41 crc kubenswrapper[4922]: E0218 13:06:41.974273 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:06:53 crc kubenswrapper[4922]: I0218 13:06:53.973717 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:06:53 crc kubenswrapper[4922]: E0218 13:06:53.974587 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:07:07 crc kubenswrapper[4922]: I0218 
13:07:07.972931 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:07:07 crc kubenswrapper[4922]: E0218 13:07:07.973776 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.222444 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhb77"] Feb 18 13:07:11 crc kubenswrapper[4922]: E0218 13:07:11.223277 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="extract-content" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223576 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="extract-content" Feb 18 13:07:11 crc kubenswrapper[4922]: E0218 13:07:11.223598 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="extract-utilities" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223606 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="extract-utilities" Feb 18 13:07:11 crc kubenswrapper[4922]: E0218 13:07:11.223615 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="gather" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223624 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="gather" Feb 18 13:07:11 crc 
kubenswrapper[4922]: E0218 13:07:11.223647 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="copy" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223656 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="copy" Feb 18 13:07:11 crc kubenswrapper[4922]: E0218 13:07:11.223672 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="registry-server" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223679 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="registry-server" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223933 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4669bde-5144-4129-8236-f152c6a30cad" containerName="registry-server" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223963 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="copy" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.223979 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0dbd879-002b-41f9-aa0f-dd9a7aa8fa44" containerName="gather" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.227415 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.237088 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhb77"] Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.402127 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-utilities\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.402181 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h7b9\" (UniqueName: \"kubernetes.io/projected/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-kube-api-access-8h7b9\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.402521 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-catalog-content\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.504548 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-utilities\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.504613 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8h7b9\" (UniqueName: \"kubernetes.io/projected/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-kube-api-access-8h7b9\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.504747 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-catalog-content\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.505172 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-utilities\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.505311 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-catalog-content\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.525963 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h7b9\" (UniqueName: \"kubernetes.io/projected/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-kube-api-access-8h7b9\") pod \"certified-operators-mhb77\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:11 crc kubenswrapper[4922]: I0218 13:07:11.552431 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:12 crc kubenswrapper[4922]: I0218 13:07:12.120030 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhb77"] Feb 18 13:07:12 crc kubenswrapper[4922]: I0218 13:07:12.934112 4922 generic.go:334] "Generic (PLEG): container finished" podID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerID="2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f" exitCode=0 Feb 18 13:07:12 crc kubenswrapper[4922]: I0218 13:07:12.934215 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerDied","Data":"2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f"} Feb 18 13:07:12 crc kubenswrapper[4922]: I0218 13:07:12.934411 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerStarted","Data":"9b23fc28df447b251af9fd618c70dcc3bcf509261525ed49f1de115eec8204ed"} Feb 18 13:07:14 crc kubenswrapper[4922]: I0218 13:07:14.955759 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerStarted","Data":"d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9"} Feb 18 13:07:15 crc kubenswrapper[4922]: E0218 13:07:15.415985 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c08d45_e7a7_4df0_b6f5_bc7467e63e0c.slice/crio-conmon-d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c08d45_e7a7_4df0_b6f5_bc7467e63e0c.slice/crio-d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9.scope\": RecentStats: unable to find data in memory cache]" Feb 18 13:07:15 crc kubenswrapper[4922]: I0218 13:07:15.969552 4922 generic.go:334] "Generic (PLEG): container finished" podID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerID="d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9" exitCode=0 Feb 18 13:07:15 crc kubenswrapper[4922]: I0218 13:07:15.969618 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerDied","Data":"d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9"} Feb 18 13:07:16 crc kubenswrapper[4922]: I0218 13:07:16.994785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerStarted","Data":"1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9"} Feb 18 13:07:17 crc kubenswrapper[4922]: I0218 13:07:17.015896 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhb77" podStartSLOduration=2.3724311240000002 podStartE2EDuration="6.015880288s" podCreationTimestamp="2026-02-18 13:07:11 +0000 UTC" firstStartedPulling="2026-02-18 13:07:12.935688149 +0000 UTC m=+5434.663392229" lastFinishedPulling="2026-02-18 13:07:16.579137313 +0000 UTC m=+5438.306841393" observedRunningTime="2026-02-18 13:07:17.011787134 +0000 UTC m=+5438.739491224" watchObservedRunningTime="2026-02-18 13:07:17.015880288 +0000 UTC m=+5438.743584368" Feb 18 13:07:21 crc kubenswrapper[4922]: I0218 13:07:21.553513 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:21 crc 
kubenswrapper[4922]: I0218 13:07:21.554106 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:21 crc kubenswrapper[4922]: I0218 13:07:21.600560 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:22 crc kubenswrapper[4922]: I0218 13:07:22.079641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:22 crc kubenswrapper[4922]: I0218 13:07:22.122210 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhb77"] Feb 18 13:07:22 crc kubenswrapper[4922]: I0218 13:07:22.973125 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:07:22 crc kubenswrapper[4922]: E0218 13:07:22.973720 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.048948 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhb77" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="registry-server" containerID="cri-o://1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9" gracePeriod=2 Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.515011 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.665765 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-utilities\") pod \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.665937 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h7b9\" (UniqueName: \"kubernetes.io/projected/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-kube-api-access-8h7b9\") pod \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.666006 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-catalog-content\") pod \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\" (UID: \"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c\") " Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.666811 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-utilities" (OuterVolumeSpecName: "utilities") pod "45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" (UID: "45c08d45-e7a7-4df0-b6f5-bc7467e63e0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.674105 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-kube-api-access-8h7b9" (OuterVolumeSpecName: "kube-api-access-8h7b9") pod "45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" (UID: "45c08d45-e7a7-4df0-b6f5-bc7467e63e0c"). InnerVolumeSpecName "kube-api-access-8h7b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.768562 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.768599 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h7b9\" (UniqueName: \"kubernetes.io/projected/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-kube-api-access-8h7b9\") on node \"crc\" DevicePath \"\"" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.939890 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" (UID: "45c08d45-e7a7-4df0-b6f5-bc7467e63e0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:07:24 crc kubenswrapper[4922]: I0218 13:07:24.972425 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.058956 4922 generic.go:334] "Generic (PLEG): container finished" podID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerID="1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9" exitCode=0 Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.059018 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerDied","Data":"1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9"} Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.059950 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-mhb77" event={"ID":"45c08d45-e7a7-4df0-b6f5-bc7467e63e0c","Type":"ContainerDied","Data":"9b23fc28df447b251af9fd618c70dcc3bcf509261525ed49f1de115eec8204ed"} Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.059049 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhb77" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.059990 4922 scope.go:117] "RemoveContainer" containerID="1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.085013 4922 scope.go:117] "RemoveContainer" containerID="d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.095253 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhb77"] Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.109338 4922 scope.go:117] "RemoveContainer" containerID="2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.121486 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhb77"] Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.159399 4922 scope.go:117] "RemoveContainer" containerID="1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9" Feb 18 13:07:25 crc kubenswrapper[4922]: E0218 13:07:25.159845 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9\": container with ID starting with 1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9 not found: ID does not exist" containerID="1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 
13:07:25.159892 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9"} err="failed to get container status \"1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9\": rpc error: code = NotFound desc = could not find container \"1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9\": container with ID starting with 1e335fe845c10501bf69adb521d2071968fffed624df954ac9f16774cf7af8e9 not found: ID does not exist" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.159921 4922 scope.go:117] "RemoveContainer" containerID="d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9" Feb 18 13:07:25 crc kubenswrapper[4922]: E0218 13:07:25.160117 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9\": container with ID starting with d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9 not found: ID does not exist" containerID="d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.160142 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9"} err="failed to get container status \"d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9\": rpc error: code = NotFound desc = could not find container \"d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9\": container with ID starting with d91b6cea303f2de28d0a8cb55b55ce4c6b9bf9ae3bf255b5e82505529f1828d9 not found: ID does not exist" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.160157 4922 scope.go:117] "RemoveContainer" containerID="2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f" Feb 18 13:07:25 crc 
kubenswrapper[4922]: E0218 13:07:25.160427 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f\": container with ID starting with 2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f not found: ID does not exist" containerID="2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f" Feb 18 13:07:25 crc kubenswrapper[4922]: I0218 13:07:25.160448 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f"} err="failed to get container status \"2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f\": rpc error: code = NotFound desc = could not find container \"2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f\": container with ID starting with 2dd7ee1f953724e3c0334d3deb2d01dc2999914581eb57258d980386aa039c6f not found: ID does not exist" Feb 18 13:07:26 crc kubenswrapper[4922]: I0218 13:07:26.990023 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" path="/var/lib/kubelet/pods/45c08d45-e7a7-4df0-b6f5-bc7467e63e0c/volumes" Feb 18 13:07:37 crc kubenswrapper[4922]: I0218 13:07:37.973623 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:07:37 crc kubenswrapper[4922]: E0218 13:07:37.974386 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:07:52 crc 
kubenswrapper[4922]: I0218 13:07:52.973606 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:07:52 crc kubenswrapper[4922]: E0218 13:07:52.974309 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:08:07 crc kubenswrapper[4922]: I0218 13:08:07.973325 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:08:07 crc kubenswrapper[4922]: E0218 13:08:07.974482 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.813179 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tr7d"] Feb 18 13:08:15 crc kubenswrapper[4922]: E0218 13:08:15.815150 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="registry-server" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.815229 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="registry-server" Feb 18 13:08:15 crc kubenswrapper[4922]: E0218 13:08:15.815291 4922 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="extract-content" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.815352 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="extract-content" Feb 18 13:08:15 crc kubenswrapper[4922]: E0218 13:08:15.815437 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="extract-utilities" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.815498 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="extract-utilities" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.815737 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c08d45-e7a7-4df0-b6f5-bc7467e63e0c" containerName="registry-server" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.817273 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.831179 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tr7d"] Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.948489 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-utilities\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.948563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-catalog-content\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:15 crc kubenswrapper[4922]: I0218 13:08:15.949092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4nf\" (UniqueName: \"kubernetes.io/projected/dafba0f3-3fc9-4640-9b50-ed91dacae456-kube-api-access-hr4nf\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.050724 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-catalog-content\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.050865 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hr4nf\" (UniqueName: \"kubernetes.io/projected/dafba0f3-3fc9-4640-9b50-ed91dacae456-kube-api-access-hr4nf\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.050917 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-utilities\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.051340 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-catalog-content\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.051352 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-utilities\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.083148 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr4nf\" (UniqueName: \"kubernetes.io/projected/dafba0f3-3fc9-4640-9b50-ed91dacae456-kube-api-access-hr4nf\") pod \"redhat-marketplace-2tr7d\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.146397 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:16 crc kubenswrapper[4922]: I0218 13:08:16.599991 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tr7d"] Feb 18 13:08:17 crc kubenswrapper[4922]: I0218 13:08:17.511944 4922 generic.go:334] "Generic (PLEG): container finished" podID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerID="fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9" exitCode=0 Feb 18 13:08:17 crc kubenswrapper[4922]: I0218 13:08:17.512032 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerDied","Data":"fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9"} Feb 18 13:08:17 crc kubenswrapper[4922]: I0218 13:08:17.512301 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerStarted","Data":"92e95d0f181c491562dbf6090df7742c16451f82f80274d633c92225b2a86025"} Feb 18 13:08:18 crc kubenswrapper[4922]: I0218 13:08:18.522304 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerStarted","Data":"78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9"} Feb 18 13:08:19 crc kubenswrapper[4922]: I0218 13:08:19.532586 4922 generic.go:334] "Generic (PLEG): container finished" podID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerID="78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9" exitCode=0 Feb 18 13:08:19 crc kubenswrapper[4922]: I0218 13:08:19.532632 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" 
event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerDied","Data":"78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9"} Feb 18 13:08:19 crc kubenswrapper[4922]: I0218 13:08:19.973483 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:08:19 crc kubenswrapper[4922]: E0218 13:08:19.973903 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:08:20 crc kubenswrapper[4922]: I0218 13:08:20.544143 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerStarted","Data":"6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf"} Feb 18 13:08:20 crc kubenswrapper[4922]: I0218 13:08:20.570403 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tr7d" podStartSLOduration=3.156680084 podStartE2EDuration="5.570379904s" podCreationTimestamp="2026-02-18 13:08:15 +0000 UTC" firstStartedPulling="2026-02-18 13:08:17.515238933 +0000 UTC m=+5499.242943013" lastFinishedPulling="2026-02-18 13:08:19.928938753 +0000 UTC m=+5501.656642833" observedRunningTime="2026-02-18 13:08:20.560788311 +0000 UTC m=+5502.288492391" watchObservedRunningTime="2026-02-18 13:08:20.570379904 +0000 UTC m=+5502.298083994" Feb 18 13:08:26 crc kubenswrapper[4922]: I0218 13:08:26.147158 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:26 crc 
kubenswrapper[4922]: I0218 13:08:26.148075 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:26 crc kubenswrapper[4922]: I0218 13:08:26.190407 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:26 crc kubenswrapper[4922]: I0218 13:08:26.661206 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:26 crc kubenswrapper[4922]: I0218 13:08:26.715720 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tr7d"] Feb 18 13:08:28 crc kubenswrapper[4922]: I0218 13:08:28.627548 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2tr7d" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="registry-server" containerID="cri-o://6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf" gracePeriod=2 Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.090004 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.115692 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-catalog-content\") pod \"dafba0f3-3fc9-4640-9b50-ed91dacae456\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.115774 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr4nf\" (UniqueName: \"kubernetes.io/projected/dafba0f3-3fc9-4640-9b50-ed91dacae456-kube-api-access-hr4nf\") pod \"dafba0f3-3fc9-4640-9b50-ed91dacae456\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.115958 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-utilities\") pod \"dafba0f3-3fc9-4640-9b50-ed91dacae456\" (UID: \"dafba0f3-3fc9-4640-9b50-ed91dacae456\") " Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.116826 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-utilities" (OuterVolumeSpecName: "utilities") pod "dafba0f3-3fc9-4640-9b50-ed91dacae456" (UID: "dafba0f3-3fc9-4640-9b50-ed91dacae456"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.117348 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.122593 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafba0f3-3fc9-4640-9b50-ed91dacae456-kube-api-access-hr4nf" (OuterVolumeSpecName: "kube-api-access-hr4nf") pod "dafba0f3-3fc9-4640-9b50-ed91dacae456" (UID: "dafba0f3-3fc9-4640-9b50-ed91dacae456"). InnerVolumeSpecName "kube-api-access-hr4nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.144601 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dafba0f3-3fc9-4640-9b50-ed91dacae456" (UID: "dafba0f3-3fc9-4640-9b50-ed91dacae456"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.218910 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dafba0f3-3fc9-4640-9b50-ed91dacae456-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.218952 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr4nf\" (UniqueName: \"kubernetes.io/projected/dafba0f3-3fc9-4640-9b50-ed91dacae456-kube-api-access-hr4nf\") on node \"crc\" DevicePath \"\"" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.639267 4922 generic.go:334] "Generic (PLEG): container finished" podID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerID="6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf" exitCode=0 Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.639312 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerDied","Data":"6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf"} Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.639338 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tr7d" event={"ID":"dafba0f3-3fc9-4640-9b50-ed91dacae456","Type":"ContainerDied","Data":"92e95d0f181c491562dbf6090df7742c16451f82f80274d633c92225b2a86025"} Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.639344 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tr7d" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.639354 4922 scope.go:117] "RemoveContainer" containerID="6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.659164 4922 scope.go:117] "RemoveContainer" containerID="78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.681838 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tr7d"] Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.690919 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tr7d"] Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.698908 4922 scope.go:117] "RemoveContainer" containerID="fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.736941 4922 scope.go:117] "RemoveContainer" containerID="6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf" Feb 18 13:08:29 crc kubenswrapper[4922]: E0218 13:08:29.737390 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf\": container with ID starting with 6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf not found: ID does not exist" containerID="6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.737432 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf"} err="failed to get container status \"6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf\": rpc error: code = NotFound desc = could not find container 
\"6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf\": container with ID starting with 6b04406f1f39eb78dd15003661715a98b5b5b4893d808592949d3621c656e8bf not found: ID does not exist" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.737458 4922 scope.go:117] "RemoveContainer" containerID="78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9" Feb 18 13:08:29 crc kubenswrapper[4922]: E0218 13:08:29.737721 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9\": container with ID starting with 78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9 not found: ID does not exist" containerID="78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.737750 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9"} err="failed to get container status \"78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9\": rpc error: code = NotFound desc = could not find container \"78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9\": container with ID starting with 78f78102af1e4ae9d9c1f68a59aa0fdad955fdc330d38dc31f6af099cc297af9 not found: ID does not exist" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.737764 4922 scope.go:117] "RemoveContainer" containerID="fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9" Feb 18 13:08:29 crc kubenswrapper[4922]: E0218 13:08:29.737996 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9\": container with ID starting with fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9 not found: ID does not exist" 
containerID="fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9" Feb 18 13:08:29 crc kubenswrapper[4922]: I0218 13:08:29.738018 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9"} err="failed to get container status \"fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9\": rpc error: code = NotFound desc = could not find container \"fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9\": container with ID starting with fed158d3668fe2c8d8f255ced66c9052205f77cd8cc2ad5b3e580da203887ed9 not found: ID does not exist" Feb 18 13:08:30 crc kubenswrapper[4922]: I0218 13:08:30.973535 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:08:30 crc kubenswrapper[4922]: E0218 13:08:30.974092 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:08:30 crc kubenswrapper[4922]: I0218 13:08:30.984933 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" path="/var/lib/kubelet/pods/dafba0f3-3fc9-4640-9b50-ed91dacae456/volumes" Feb 18 13:08:45 crc kubenswrapper[4922]: I0218 13:08:45.973430 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:08:45 crc kubenswrapper[4922]: E0218 13:08:45.974437 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:08:59 crc kubenswrapper[4922]: I0218 13:08:59.973675 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003" Feb 18 13:08:59 crc kubenswrapper[4922]: E0218 13:08:59.974382 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.246959 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pw2bz"] Feb 18 13:09:10 crc kubenswrapper[4922]: E0218 13:09:10.248961 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="extract-utilities" Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.249041 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="extract-utilities" Feb 18 13:09:10 crc kubenswrapper[4922]: E0218 13:09:10.249142 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="extract-content" Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.249205 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="extract-content" Feb 18 13:09:10 crc kubenswrapper[4922]: E0218 13:09:10.249265 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="registry-server"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.249317 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="registry-server"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.249560 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafba0f3-3fc9-4640-9b50-ed91dacae456" containerName="registry-server"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.251236 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.291411 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pw2bz"]
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.418329 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-catalog-content\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.418709 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2tc\" (UniqueName: \"kubernetes.io/projected/b0615d08-764f-4a22-8877-94c6e23119ef-kube-api-access-wb2tc\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.418778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-utilities\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.521232 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2tc\" (UniqueName: \"kubernetes.io/projected/b0615d08-764f-4a22-8877-94c6e23119ef-kube-api-access-wb2tc\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.521671 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-utilities\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.521925 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-catalog-content\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.522134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-utilities\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.522290 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-catalog-content\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.541280 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2tc\" (UniqueName: \"kubernetes.io/projected/b0615d08-764f-4a22-8877-94c6e23119ef-kube-api-access-wb2tc\") pod \"community-operators-pw2bz\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") " pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:10 crc kubenswrapper[4922]: I0218 13:09:10.596323 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:11 crc kubenswrapper[4922]: I0218 13:09:11.088253 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pw2bz"]
Feb 18 13:09:12 crc kubenswrapper[4922]: I0218 13:09:12.009250 4922 generic.go:334] "Generic (PLEG): container finished" podID="b0615d08-764f-4a22-8877-94c6e23119ef" containerID="908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74" exitCode=0
Feb 18 13:09:12 crc kubenswrapper[4922]: I0218 13:09:12.009441 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerDied","Data":"908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74"}
Feb 18 13:09:12 crc kubenswrapper[4922]: I0218 13:09:12.009975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerStarted","Data":"e7c03e13078ba65076c28d4d878840509f8443583976608c7c256183fc823241"}
Feb 18 13:09:13 crc kubenswrapper[4922]: I0218 13:09:13.020675 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerStarted","Data":"65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b"}
Feb 18 13:09:14 crc kubenswrapper[4922]: I0218 13:09:14.973480 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003"
Feb 18 13:09:14 crc kubenswrapper[4922]: E0218 13:09:14.974705 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 13:09:15 crc kubenswrapper[4922]: I0218 13:09:15.038101 4922 generic.go:334] "Generic (PLEG): container finished" podID="b0615d08-764f-4a22-8877-94c6e23119ef" containerID="65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b" exitCode=0
Feb 18 13:09:15 crc kubenswrapper[4922]: I0218 13:09:15.038147 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerDied","Data":"65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b"}
Feb 18 13:09:15 crc kubenswrapper[4922]: I0218 13:09:15.041452 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 13:09:16 crc kubenswrapper[4922]: I0218 13:09:16.048694 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerStarted","Data":"98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e"}
Feb 18 13:09:16 crc kubenswrapper[4922]: I0218 13:09:16.070719 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pw2bz" podStartSLOduration=2.652823639 podStartE2EDuration="6.070693909s" podCreationTimestamp="2026-02-18 13:09:10 +0000 UTC" firstStartedPulling="2026-02-18 13:09:12.013297487 +0000 UTC m=+5553.741001567" lastFinishedPulling="2026-02-18 13:09:15.431167757 +0000 UTC m=+5557.158871837" observedRunningTime="2026-02-18 13:09:16.06560971 +0000 UTC m=+5557.793313790" watchObservedRunningTime="2026-02-18 13:09:16.070693909 +0000 UTC m=+5557.798397989"
Feb 18 13:09:20 crc kubenswrapper[4922]: I0218 13:09:20.597511 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:20 crc kubenswrapper[4922]: I0218 13:09:20.598062 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:20 crc kubenswrapper[4922]: I0218 13:09:20.643620 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:21 crc kubenswrapper[4922]: I0218 13:09:21.146398 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:21 crc kubenswrapper[4922]: I0218 13:09:21.198840 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pw2bz"]
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.110267 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pw2bz" podUID="b0615d08-764f-4a22-8877-94c6e23119ef" containerName="registry-server" containerID="cri-o://98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e" gracePeriod=2
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.573311 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.680521 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-catalog-content\") pod \"b0615d08-764f-4a22-8877-94c6e23119ef\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") "
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.680781 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb2tc\" (UniqueName: \"kubernetes.io/projected/b0615d08-764f-4a22-8877-94c6e23119ef-kube-api-access-wb2tc\") pod \"b0615d08-764f-4a22-8877-94c6e23119ef\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") "
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.680855 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-utilities\") pod \"b0615d08-764f-4a22-8877-94c6e23119ef\" (UID: \"b0615d08-764f-4a22-8877-94c6e23119ef\") "
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.682385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-utilities" (OuterVolumeSpecName: "utilities") pod "b0615d08-764f-4a22-8877-94c6e23119ef" (UID: "b0615d08-764f-4a22-8877-94c6e23119ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.687228 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0615d08-764f-4a22-8877-94c6e23119ef-kube-api-access-wb2tc" (OuterVolumeSpecName: "kube-api-access-wb2tc") pod "b0615d08-764f-4a22-8877-94c6e23119ef" (UID: "b0615d08-764f-4a22-8877-94c6e23119ef"). InnerVolumeSpecName "kube-api-access-wb2tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.737650 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0615d08-764f-4a22-8877-94c6e23119ef" (UID: "b0615d08-764f-4a22-8877-94c6e23119ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.783109 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.783563 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb2tc\" (UniqueName: \"kubernetes.io/projected/b0615d08-764f-4a22-8877-94c6e23119ef-kube-api-access-wb2tc\") on node \"crc\" DevicePath \"\""
Feb 18 13:09:23 crc kubenswrapper[4922]: I0218 13:09:23.783693 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0615d08-764f-4a22-8877-94c6e23119ef-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.123721 4922 generic.go:334] "Generic (PLEG): container finished" podID="b0615d08-764f-4a22-8877-94c6e23119ef" containerID="98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e" exitCode=0
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.123779 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerDied","Data":"98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e"}
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.123825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw2bz" event={"ID":"b0615d08-764f-4a22-8877-94c6e23119ef","Type":"ContainerDied","Data":"e7c03e13078ba65076c28d4d878840509f8443583976608c7c256183fc823241"}
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.123847 4922 scope.go:117] "RemoveContainer" containerID="98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.123919 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw2bz"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.145277 4922 scope.go:117] "RemoveContainer" containerID="65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.157647 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pw2bz"]
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.167026 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pw2bz"]
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.189121 4922 scope.go:117] "RemoveContainer" containerID="908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.208298 4922 scope.go:117] "RemoveContainer" containerID="98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e"
Feb 18 13:09:24 crc kubenswrapper[4922]: E0218 13:09:24.208714 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e\": container with ID starting with 98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e not found: ID does not exist" containerID="98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.208767 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e"} err="failed to get container status \"98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e\": rpc error: code = NotFound desc = could not find container \"98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e\": container with ID starting with 98c26d86df78b4f621fa13bc0f508b740a3e9be7d2950a117cc1d052828a4b5e not found: ID does not exist"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.208798 4922 scope.go:117] "RemoveContainer" containerID="65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b"
Feb 18 13:09:24 crc kubenswrapper[4922]: E0218 13:09:24.209060 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b\": container with ID starting with 65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b not found: ID does not exist" containerID="65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.209088 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b"} err="failed to get container status \"65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b\": rpc error: code = NotFound desc = could not find container \"65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b\": container with ID starting with 65525613a226afeeb976852cdbbb3222a7eb669500cd277dc78ac0385c20605b not found: ID does not exist"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.209105 4922 scope.go:117] "RemoveContainer" containerID="908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74"
Feb 18 13:09:24 crc kubenswrapper[4922]: E0218 13:09:24.209382 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74\": container with ID starting with 908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74 not found: ID does not exist" containerID="908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.209407 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74"} err="failed to get container status \"908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74\": rpc error: code = NotFound desc = could not find container \"908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74\": container with ID starting with 908890a96584ac69af3d7a4795a02523328fd40c50d264b71e0dd53105fb4a74 not found: ID does not exist"
Feb 18 13:09:24 crc kubenswrapper[4922]: I0218 13:09:24.987881 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0615d08-764f-4a22-8877-94c6e23119ef" path="/var/lib/kubelet/pods/b0615d08-764f-4a22-8877-94c6e23119ef/volumes"
Feb 18 13:09:28 crc kubenswrapper[4922]: I0218 13:09:28.980716 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003"
Feb 18 13:09:28 crc kubenswrapper[4922]: E0218 13:09:28.981299 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-znglx_openshift-machine-config-operator(fdb7cedc-b2e3-48f0-80e0-e17073b43228)\"" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228"
Feb 18 13:09:41 crc kubenswrapper[4922]: I0218 13:09:41.973831 4922 scope.go:117] "RemoveContainer" containerID="112ba381fc4b9cc3120f9b69127c4ddbe0d5c14674a4b80d3ac1d60cc8c37003"
Feb 18 13:09:42 crc kubenswrapper[4922]: I0218 13:09:42.290807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-znglx" event={"ID":"fdb7cedc-b2e3-48f0-80e0-e17073b43228","Type":"ContainerStarted","Data":"d5a75f5dffb9bb27b6966398e4cdc574eee1f469794b1d2baf0992bcc03c6b7b"}
Feb 18 13:12:09 crc kubenswrapper[4922]: I0218 13:12:09.807812 4922 patch_prober.go:28] interesting pod/machine-config-daemon-znglx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 13:12:09 crc kubenswrapper[4922]: I0218 13:12:09.808401 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-znglx" podUID="fdb7cedc-b2e3-48f0-80e0-e17073b43228" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"